[ 576.806839] env[60357]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 577.266748] env[60400]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 578.793276] env[60400]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' {{(pid=60400) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 578.793718] env[60400]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' {{(pid=60400) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 578.793718] env[60400]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' {{(pid=60400) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 578.793968] env[60400]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 578.795061] env[60400]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: <service_down_time: 60, report_interval: 120>. Setting service_down_time to: 300
[ 578.912169] env[60400]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=60400) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 578.922996] env[60400]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=60400) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 579.023652] env[60400]: INFO nova.virt.driver [None req-380d47ea-b44f-4c8f-8f44-66577d18d370 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 579.099920] env[60400]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 579.100105] env[60400]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 579.100199] env[60400]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=60400) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 582.273072] env[60400]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-7234d023-d33f-4cb4-9ddc-a80c265d6597 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.289435] env[60400]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=60400) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 582.289601] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-742fd7a3-37e2-4f07-b95d-142ff4742c24 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.322204] env[60400]: INFO oslo_vmware.api [-] Successfully established new session; session ID is dd93e.
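Editor's note: the WARNING from nova.servicegroup.api above is nova's sanity check on the heartbeat settings. With report_interval = 120 and a service-down threshold no larger than that, the service raises service_down_time to 2.5x the report interval, which matches the service_down_time = 300 in the config dump below. A minimal sketch of that adjustment (the helper name is ours, not nova's):

    import logging

    LOG = logging.getLogger(__name__)

    def adjust_service_down_time(report_interval, service_down_time):
        # Heartbeats must arrive well before a service is declared down;
        # otherwise raise the threshold to 2.5x the report interval.
        if service_down_time <= report_interval:
            service_down_time = int(report_interval * 2.5)
            LOG.warning('Report interval must be less than service down '
                        'time. Setting service_down_time to: %s',
                        service_down_time)
        return service_down_time

    # With the values from this log: 120s reports -> 300s down time.
    assert adjust_service_down_time(120, 60) == 300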
[ 582.322412] env[60400]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.222s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 582.322947] env[60400]: INFO nova.virt.vmwareapi.driver [None req-380d47ea-b44f-4c8f-8f44-66577d18d370 None None] VMware vCenter version: 7.0.3
[ 582.326292] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91fc2d62-de95-4424-8753-cde3968277b2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.344027] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fad5a1d0-2bb5-4403-816b-0f383a2db5bc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.350026] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c31f272b-b0d0-49a9-b032-aad635e92a3b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.356761] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8aebfec8-ac55-44a1-8398-7548f4d36ff9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.369789] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6647093-9c16-461e-bcad-9c467ff133da {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.376019] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfd48768-477b-4435-a191-6d71c9386df6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.406782] env[60400]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-3277b8af-b433-431b-8956-90c50e4f9a4c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.412384] env[60400]: DEBUG nova.virt.vmwareapi.driver [None req-380d47ea-b44f-4c8f-8f44-66577d18d370 None None] Extension org.openstack.compute already exists. {{(pid=60400) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 582.415007] env[60400]: INFO nova.compute.provider_config [None req-380d47ea-b44f-4c8f-8f44-66577d18d370 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
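Editor's note: the Acquiring / acquired / "released" lines around "oslo_vmware_api_lock" are emitted by oslo.concurrency's lockutils.synchronized decorator; the inner frame named in the {{...}} suffix is the decorator's wrapper, and the waited/held durations are measured around the decorated call. A minimal reproduction of the pattern (the lock name is taken from the log; the function body is illustrative):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('oslo_vmware_api_lock')
    def create_session():
        # Runs with the named lock held; the decorator logs the
        # "Acquiring" / "acquired ... waited" / "released ... held"
        # DEBUG lines seen above.
        return 'session'

    create_session()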
[ 582.430912] env[60400]: DEBUG nova.context [None req-380d47ea-b44f-4c8f-8f44-66577d18d370 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),d0ae54bd-e0f8-43cb-838f-391ff3e34a72(cell1) {{(pid=60400) load_cells /opt/stack/nova/nova/context.py:464}}
[ 582.432815] env[60400]: DEBUG oslo_concurrency.lockutils [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 582.433039] env[60400]: DEBUG oslo_concurrency.lockutils [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 582.433843] env[60400]: DEBUG oslo_concurrency.lockutils [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 582.434192] env[60400]: DEBUG oslo_concurrency.lockutils [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 582.434374] env[60400]: DEBUG oslo_concurrency.lockutils [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 582.435384] env[60400]: DEBUG oslo_concurrency.lockutils [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 582.447884] env[60400]: DEBUG oslo_db.sqlalchemy.engines [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60400) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 582.448354] env[60400]: DEBUG oslo_db.sqlalchemy.engines [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60400) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 582.455088] env[60400]: ERROR nova.db.main.api [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] No DB access allowed in nova-compute:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 582.455088] env[60400]:     result = function(*args, **kwargs)
[ 582.455088] env[60400]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 582.455088] env[60400]:     return func(*args, **kwargs)
[ 582.455088] env[60400]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 582.455088] env[60400]:     result = fn(*args, **kwargs)
[ 582.455088] env[60400]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 582.455088] env[60400]:     return f(*args, **kwargs)
[ 582.455088] env[60400]:   File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 582.455088] env[60400]:     return db.service_get_minimum_version(context, binaries)
[ 582.455088] env[60400]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 582.455088] env[60400]:     _check_db_access()
[ 582.455088] env[60400]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 582.455088] env[60400]:     stacktrace = ''.join(traceback.format_stack())
[ 582.455088] env[60400]:
[ 582.455746] env[60400]: ERROR nova.db.main.api [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] No DB access allowed in nova-compute:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 582.455746] env[60400]:     result = function(*args, **kwargs)
[ 582.455746] env[60400]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 582.455746] env[60400]:     return func(*args, **kwargs)
[ 582.455746] env[60400]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 582.455746] env[60400]:     result = fn(*args, **kwargs)
[ 582.455746] env[60400]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 582.455746] env[60400]:     return f(*args, **kwargs)
[ 582.455746] env[60400]:   File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 582.455746] env[60400]:     return db.service_get_minimum_version(context, binaries)
[ 582.455746] env[60400]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 582.455746] env[60400]:     _check_db_access()
[ 582.455746] env[60400]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 582.455746] env[60400]:     stacktrace = ''.join(traceback.format_stack())
[ 582.455746] env[60400]:
[ 582.456330] env[60400]: WARNING nova.objects.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Failed to get minimum service version for cell d0ae54bd-e0f8-43cb-838f-391ff3e34a72
[ 582.456330] env[60400]: WARNING nova.objects.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 582.456632] env[60400]: DEBUG oslo_concurrency.lockutils [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Acquiring lock "singleton_lock" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 582.456782] env[60400]: DEBUG oslo_concurrency.lockutils [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Acquired lock "singleton_lock" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 582.457026] env[60400]: DEBUG oslo_concurrency.lockutils [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Releasing lock "singleton_lock" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 582.457339] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Full set of CONF: {{(pid=60400) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 582.457473] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ******************************************************************************** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 582.457595] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] Configuration options gathered from: {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 582.457723] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 582.457911] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 582.458042] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ================================================================================ {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 582.458245] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] allow_resize_to_same_host = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.458405] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] arq_binding_timeout = 300 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.458529] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] backdoor_port = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.458651] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] backdoor_socket = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.458805] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] block_device_allocate_retries = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.458957] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] block_device_allocate_retries_interval = 3 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.459128] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cert = self.pem {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.459286] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.459463] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute_monitors = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.459634] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] config_dir = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.459798] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] config_drive_format = iso9660 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.459929] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.460103] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] config_source = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.460263] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] console_host = devstack {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.460422] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] control_exchange = nova {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.460592] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cpu_allocation_ratio = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.460748] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] daemon = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.460906] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] debug = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.461065] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] default_access_ip_network_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.461224] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] default_availability_zone = nova {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.461369] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] default_ephemeral_format = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.461593] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.461755] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] default_schedule_zone = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.461928] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] disk_allocation_ratio = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.462102] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] enable_new_services = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.462275] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] enabled_apis = ['osapi_compute'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.462431] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] enabled_ssl_apis = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.462583] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] flat_injected = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.462734] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] force_config_drive = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.462883] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] force_raw_images = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.463072] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] graceful_shutdown_timeout = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.463239] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] heal_instance_info_cache_interval = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.463443] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] host = cpu-1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.463607] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.463764] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] initial_disk_allocation_ratio = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.463916] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] initial_ram_allocation_ratio = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.464258] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.464519] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] instance_build_timeout = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.464777] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] instance_delete_interval = 300 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.465057] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] instance_format = [instance: %(uuid)s] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.465304] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] instance_name_template = instance-%08x {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.465481] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] instance_usage_audit = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.465652] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] instance_usage_audit_period = month {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.465813] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.465971] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] instances_path = /opt/stack/data/nova/instances {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.466141] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] internal_service_availability_zone = internal {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.466289] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] key = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.466443] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] live_migration_retry_count = 30 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.466601] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] log_config_append = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.466755] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.466902] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] log_dir = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.467058] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] log_file = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.467176] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] log_options = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.467328] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] log_rotate_interval = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.467485] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] log_rotate_interval_type = days {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.467642] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] log_rotation_type = none {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.467762] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.467881] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.468045] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.468226] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.468348] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.468470] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] long_rpc_timeout = 1800 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.468619] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] max_concurrent_builds = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.468767] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] max_concurrent_live_migrations = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.468916] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] max_concurrent_snapshots = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.469078] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] max_local_block_devices = 3 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.469234] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] max_logfile_count = 30 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.469383] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] max_logfile_size_mb = 200 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.469557] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] maximum_instance_delete_attempts = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.469721] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] metadata_listen = 0.0.0.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.469878] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] metadata_listen_port = 8775 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.470048] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] metadata_workers = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.470205] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] migrate_max_retries = -1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.470361] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] mkisofs_cmd = genisoimage {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.470558] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] my_block_storage_ip = 10.180.1.21 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.470682] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] my_ip = 10.180.1.21 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.470836] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] network_allocate_retries = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.471017] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.471174] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] osapi_compute_listen = 0.0.0.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.471326] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] osapi_compute_listen_port = 8774 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
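Editor's note: notice the pairing in the dump above: cpu_allocation_ratio (and the disk/ram variants) is None while initial_cpu_allocation_ratio = 4.0. In nova, the unset ratio means the initial_* value is what seeds a new compute node's inventory; thereafter the ratio can be managed in placement. A one-line illustration of that fallback (hypothetical helper, not nova code):

    def effective_ratio(configured, initial):
        # None means "not explicitly set": fall back to the initial_* option.
        return initial if configured is None else configured

    assert effective_ratio(None, 4.0) == 4.0  # cpu ratio from this dump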
[ 582.471481] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] osapi_compute_unique_server_name_scope = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.471641] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] osapi_compute_workers = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.471809] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] password_length = 12 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.471972] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] periodic_enable = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.472140] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] periodic_fuzzy_delay = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.472300] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] pointer_model = usbtablet {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.472457] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] preallocate_images = none {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.472607] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] publish_errors = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.472727] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] pybasedir = /opt/stack/nova {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.472884] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ram_allocation_ratio = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.473035] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] rate_limit_burst = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.473195] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] rate_limit_except_level = CRITICAL {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.473346] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] rate_limit_interval = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.473497] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] reboot_timeout = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.473647] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] reclaim_instance_interval = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.473793] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] record = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.473955] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] reimage_timeout_per_gb = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.474124] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] report_interval = 120 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.474278] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] rescue_timeout = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.474433] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] reserved_host_cpus = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.474587] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] reserved_host_disk_mb = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.474737] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] reserved_host_memory_mb = 512 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.474912] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] reserved_huge_pages = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.475095] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] resize_confirm_window = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.475251] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] resize_fs_using_block_device = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.475405] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] resume_guests_state_on_host_boot = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.475563] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.475716] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] rpc_response_timeout = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.475865] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] run_external_periodic_tasks = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.476031] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] running_deleted_instance_action = reap {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.476189] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] running_deleted_instance_poll_interval = 1800 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.476339] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] running_deleted_instance_timeout = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.476491] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler_instance_sync_interval = 120 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.476620] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_down_time = 300 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.476781] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] servicegroup_driver = db {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.476933] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] shelved_offload_time = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.477092] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] shelved_poll_interval = 3600 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.477252] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] shutdown_timeout = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.477404] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] source_is_ipv6 = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.477554] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ssl_only = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.477792] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.477972] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] sync_power_state_interval = 600 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.478155] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] sync_power_state_pool_size = 1000 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.478313] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] syslog_log_facility = LOG_USER {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.478461] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] tempdir = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.478614] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] timeout_nbd = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.478773] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] transport_url = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.478924] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] update_resources_interval = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.479087] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] use_cow_images = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.479240] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] use_eventlog = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.479391] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] use_journal = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.479570] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] use_json = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.479735] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] use_rootwrap_daemon = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.479882] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] use_stderr = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.480041] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] use_syslog = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.480192] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vcpu_pin_set = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.480349] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plugging_is_fatal = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.480507] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plugging_timeout = 300 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.480693] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] virt_mkfs = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.480861] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] volume_usage_poll_interval = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.481022] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] watch_log_file = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.481186] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] web = /usr/share/spice-html5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 582.481365] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_concurrency.disable_process_locking = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.481655] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.481843] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.482012] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.482183] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.482344] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.482503] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.482679] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.auth_strategy = keystone {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.482839] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.compute_link_prefix = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.483009] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.483182] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.dhcp_domain = novalocal {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.483343] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.enable_instance_password = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
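Editor's note: from oslo_concurrency.disable_process_locking onward the dump switches from global options (logged at cfg.py:2602) to per-section options such as [api] and [cache] (logged at cfg.py:2609). In code, grouped options are registered against and read through the group attribute on the config object. A minimal, self-contained oslo.config example; the registration shown is illustrative, not nova's own:

    from oslo_config import cfg

    CONF = cfg.CONF
    # Register an option under the [api] section, the way nova does.
    CONF.register_opts([cfg.IntOpt('max_limit', default=1000)], group='api')
    CONF([])  # parse an empty command line

    # Grouped options are read as CONF.<group>.<option>.
    print(CONF.api.max_limit)  # -> 1000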
req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.glance_link_prefix = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.483660] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.483825] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.483979] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.instance_list_per_project_cells = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.484143] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.list_records_by_skipping_down_cells = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.484298] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.local_metadata_per_cell = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.484684] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.max_limit = 1000 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.484684] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.metadata_cache_expiration = 15 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.484775] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.neutron_default_tenant_id = default {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.484926] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.use_forwarded_for = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.485091] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.use_neutron_default_nets = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.485256] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.485418] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.485634] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.485813] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=60400) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.485980] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.vendordata_dynamic_targets = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.486155] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.vendordata_jsonfile_path = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.486329] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.486514] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.backend = dogpile.cache.memcached {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.486676] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.backend_argument = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.486838] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.config_prefix = cache.oslo {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.486996] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.dead_timeout = 60.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.487166] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.debug_cache_backend = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.487320] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.enable_retry_client = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.487475] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.enable_socket_keepalive = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.487640] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.enabled = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.487797] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.expiration_time = 600 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.487952] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.hashclient_retry_attempts = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.488124] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.hashclient_retry_delay = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.488279] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] 
[ 582.488440] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.memcache_password = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.488598] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.488754] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.488910] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.memcache_pool_maxsize = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.489074] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.489232] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.memcache_sasl_enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.489403] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.489595] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.memcache_socket_timeout = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.489764] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.memcache_username = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.489923] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.proxies = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.490091] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.retry_attempts = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.490251] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.retry_delay = 0.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.490414] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.socket_keepalive_count = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.490593] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.socket_keepalive_idle = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.490765] env[60400]: DEBUG oslo_service.service [None
req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.socket_keepalive_interval = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.490919] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.tls_allowed_ciphers = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.491082] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.tls_cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.491240] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.tls_certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.491396] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.tls_enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.491553] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cache.tls_keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.491708] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.auth_section = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.491873] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.auth_type = password {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.492034] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.492205] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.catalog_info = volumev3::publicURL {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.492356] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.492511] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.492665] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.cross_az_attach = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.492819] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.debug = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.492970] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.endpoint_template = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.493135] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.http_retries = 3 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
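
The cinder.catalog_info value above, volumev3::publicURL, packs three fields into one option: per its documented <service_type>:<service_name>:<endpoint_type> layout it selects the volumev3 service type, no particular service name, and the publicURL endpoint. A throwaway parser (an illustration, not Nova's own code) makes the split explicit:

    # Hypothetical helper: decompose a catalog_info triple the way its
    # documented <service_type>:<service_name>:<endpoint_type> format implies.
    def parse_catalog_info(value: str) -> dict:
        service_type, service_name, endpoint_type = value.split(':')
        return {
            'service_type': service_type,          # 'volumev3'
            'service_name': service_name or None,  # '' -> match any name
            'endpoint_type': endpoint_type,        # 'publicURL'
        }

    print(parse_catalog_info('volumev3::publicURL'))
    # {'service_type': 'volumev3', 'service_name': None, 'endpoint_type': 'publicURL'}
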
[ 582.493292] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.493443] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.493608] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.os_region_name = RegionOne {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.493764] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.493917] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cinder.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.494092] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.494247] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.cpu_dedicated_set = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.494394] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.cpu_shared_set = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.494552] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.image_type_exclude_list = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.494712] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.494867] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.max_concurrent_disk_ops = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.495032] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.max_disk_devices_to_attach = -1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.495191] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.495352] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
582.495508] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.resource_provider_association_refresh = 300 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.495665] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.shutdown_retry_interval = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.495837] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.496014] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] conductor.workers = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.496184] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] console.allowed_origins = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.496335] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] console.ssl_ciphers = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.496497] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] console.ssl_minimum_version = default {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.496664] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] consoleauth.token_ttl = 600 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.496828] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.496978] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.497145] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.497296] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.connect_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.497481] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.connect_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.497633] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.endpoint_override = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.497793] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.insecure = False {{(pid=60400) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.497942] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.498104] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.max_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.498255] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.min_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.498405] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.region_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.498551] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.service_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.498723] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.service_type = accelerator {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.498876] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.499037] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.status_code_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.499190] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.status_code_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.499340] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.499541] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.499703] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] cyborg.version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.499881] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.backend = sqlalchemy {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.500061] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.connection = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.500227] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.connection_debug = 0 {{(pid=60400) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.500387] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.connection_parameters = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.500569] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.connection_recycle_time = 3600 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.500739] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.connection_trace = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.500896] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.db_inc_retry_interval = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.501063] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.db_max_retries = 20 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.501221] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.db_max_retry_interval = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.501379] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.db_retry_interval = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.501543] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.max_overflow = 50 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.501701] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.max_pool_size = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.501861] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.max_retries = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.502024] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.mysql_enable_ndb = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.502206] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.502422] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.mysql_wsrep_sync_wait = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.502604] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.pool_timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.502830] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.retry_interval = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
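
The [database] pool values above map more or less directly onto SQLAlchemy engine parameters, which is roughly what oslo.db does internally: max_pool_size = 5 persistent connections, max_overflow = 50 extra burst connections on top (so at most 55 concurrently), each recycled after connection_recycle_time = 3600 s. A sketch under those assumptions; the URL is a placeholder, since the real database.connection is logged masked as ****, and it needs a matching driver (pymysql here):

    # Sketch only: an approximate translation of the [database] options
    # into a SQLAlchemy engine. Placeholder URL; the real value is masked.
    from sqlalchemy import create_engine

    engine = create_engine(
        'mysql+pymysql://nova:secret@db.example.test/nova',
        pool_size=5,        # database.max_pool_size = 5
        max_overflow=50,    # database.max_overflow = 50 -> up to 55 connections
        pool_recycle=3600,  # database.connection_recycle_time = 3600 s
    )
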
[ 582.503021] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.slave_connection = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.503244] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.sqlite_synchronous = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.503463] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] database.use_db_reconnect = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.503710] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.backend = sqlalchemy {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.503890] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.connection = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.504067] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.connection_debug = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.504236] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.connection_parameters = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.504398] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.connection_recycle_time = 3600 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.504584] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.connection_trace = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.504821] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.db_inc_retry_interval = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.505095] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.db_max_retries = 20 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.505285] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.db_max_retry_interval = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.505449] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.db_retry_interval = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.505618] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.max_overflow = 50 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.505776] env[60400]: DEBUG oslo_service.service [None
req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.max_pool_size = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.505964] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.max_retries = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.506131] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.mysql_enable_ndb = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.506298] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.506469] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.506663] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.pool_timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.507158] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.retry_interval = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.507353] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.slave_connection = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.507509] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] api_database.sqlite_synchronous = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.507684] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] devices.enabled_mdev_types = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.507862] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.508077] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ephemeral_storage_encryption.enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.508259] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.508432] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.api_servers = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.508596] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
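
One internally consistent pairing above: ephemeral_storage_encryption.cipher = aes-xts-plain64 with key_size = 512. XTS consumes a double-length key, one half for the data cipher and one for the tweak cipher, so a 512-bit key yields AES-256 for each half. A quick demonstration of the arithmetic:

    # Why aes-xts-plain64 goes with key_size = 512: XTS splits the
    # supplied key into a data key and a tweak key of equal length.
    import os

    xts_key = os.urandom(512 // 8)  # 64 random bytes, demo only
    data_key, tweak_key = xts_key[:32], xts_key[32:]
    assert len(data_key) * 8 == len(tweak_key) * 8 == 256
    print(f'{len(xts_key) * 8}-bit XTS key -> two 256-bit AES keys')
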
[ 582.508754] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.508915] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.509079] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.connect_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.509240] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.connect_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.509466] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.debug = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.509637] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.default_trusted_certificate_ids = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.509803] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.enable_certificate_validation = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.509961] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.enable_rbd_download = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.510126] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.endpoint_override = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.510286] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.510462] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.510737] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.max_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.510786] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.min_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.510953] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.num_retries = 3 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.511204] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.rbd_ceph_conf = {{(pid=60400) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.511388] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.rbd_connect_timeout = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.511559] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.rbd_pool = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.511736] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.rbd_user = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.511914] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.region_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.512734] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.service_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.512734] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.service_type = image {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.512734] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.512734] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.status_code_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.512734] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.status_code_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.512874] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.513067] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.513235] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.verify_glance_signatures = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.513401] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] glance.version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.513546] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] guestfs.debug = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.513713] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.config_drive_cdrom = False {{(pid=60400) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.513893] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.config_drive_inject_password = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.514083] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.514248] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.enable_instance_metrics_collection = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.514408] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.enable_remotefx = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.514575] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.instances_path_share = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.514740] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.iscsi_initiator_list = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.514900] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.limit_cpu_features = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.515071] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.515231] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.515459] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.power_state_check_timeframe = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.515638] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.515805] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.515965] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.use_multipath_io = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.516137] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.volume_attach_retry_count = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.516295] env[60400]: DEBUG oslo_service.service [None 
req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.516450] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.vswitch_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.516607] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.516822] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] mks.enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.517228] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.517456] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] image_cache.manager_interval = 2400 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.517630] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] image_cache.precache_concurrency = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.517798] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] image_cache.remove_unused_base_images = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.517962] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.518144] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.518352] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] image_cache.subdirectory_name = _base {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.518554] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.api_max_retries = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.518724] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.api_retry_interval = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.518879] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.auth_section = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.519051] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.auth_type = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
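
The [image_cache] values above set the compute node's base-image cleanup cadence: a periodic task runs every manager_interval = 2400 s (40 minutes) and, since remove_unused_base_images = True, may drop cached images once they have gone unused for 86400 s (24 h, originals) or 3600 s (1 h, resized variants). A hedged sketch of the age test those numbers imply; the function and its arguments are invented for illustration and are not Nova's actual cache manager:

    # Illustrative only: the age predicate implied by the [image_cache]
    # minimum-age settings logged above.
    import time

    ORIGINAL_MIN_AGE = 86400  # remove_unused_original_minimum_age_seconds
    RESIZED_MIN_AGE = 3600    # remove_unused_resized_minimum_age_seconds

    def is_prunable(last_used: float, resized: bool) -> bool:
        min_age = RESIZED_MIN_AGE if resized else ORIGINAL_MIN_AGE
        return (time.time() - last_used) >= min_age

    two_hours_ago = time.time() - 7200
    print(is_prunable(two_hours_ago, resized=True))   # True: unused > 1 h
    print(is_prunable(two_hours_ago, resized=False))  # False: unused < 24 h
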
[ 582.519212] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.519366] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.519559] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.519726] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.connect_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.519916] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.connect_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.520113] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.endpoint_override = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.520289] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.520453] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.520605] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.max_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.520759] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.min_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.520964] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.partition_key = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.521086] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.peer_list = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.521252] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.region_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.521447] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.serial_console_state_timeout = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.521623] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.service_name = None {{(pid=60400) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.521796] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.service_type = baremetal {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.521948] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.522115] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.status_code_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.522270] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.status_code_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.522424] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.522599] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.522757] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ironic.version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.522964] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.523182] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] key_manager.fixed_key = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.523334] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.523488] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.barbican_api_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.523641] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.barbican_endpoint = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.523813] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.barbican_endpoint_type = public {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.524006] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.barbican_region_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.524170] env[60400]: DEBUG oslo_service.service [None 
req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.524352] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.524523] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.524684] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.524837] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.524994] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.number_of_retries = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.525165] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.retry_delay = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.525323] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.send_service_user_token = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.525479] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.525636] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.525867] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.verify_ssl = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.526126] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican.verify_ssl_path = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.526317] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican_service_user.auth_section = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.526478] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican_service_user.auth_type = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.526636] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican_service_user.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.526786] env[60400]: DEBUG oslo_service.service [None 
req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican_service_user.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.526943] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican_service_user.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.527119] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican_service_user.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.527313] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican_service_user.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.527497] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican_service_user.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.527650] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] barbican_service_user.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.527811] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.approle_role_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.527964] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.approle_secret_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.528138] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.528290] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.528453] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.528616] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.528913] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.528979] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.kv_mountpoint = secret {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.529149] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.kv_version = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.529301] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.namespace = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.529502] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.root_token_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.529678] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.529834] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.ssl_ca_crt_file = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.529987] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.530194] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.use_ssl = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.530373] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.530555] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.530725] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.530888] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.531054] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.connect_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.531210] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.connect_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.531360] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.endpoint_override = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.531513] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.531689] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.531877] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.max_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.532052] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.min_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.532209] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.region_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.532360] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.service_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.532527] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.service_type = identity {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.532683] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.532833] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.status_code_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.533029] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.status_code_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.533204] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.533380] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.533536] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] keystone.version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
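Every record in this dump comes from the same place: oslo.config walks each registered option group and logs one "group.key = value" line via ConfigOpts.log_opt_values() (the cfg.py:2609 frame on every record). A minimal sketch of that mechanism, assuming stock oslo.config; the option names below are illustrative, and note that options declared secret=True (passwords, tokens) are rendered as '****' rather than their real values:

    import logging

    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF = cfg.ConfigOpts()
    CONF.register_opts(
        [
            cfg.StrOpt('cafile'),                 # defaults to None, as above
            cfg.StrOpt('service_type', default='identity'),
            cfg.ListOpt('valid_interfaces', default=['internal', 'public']),
            cfg.StrOpt('password', secret=True),  # logged as ****
        ],
        group='keystone')

    CONF([])                                 # parse an (empty) command line
    CONF.log_opt_values(LOG, logging.DEBUG)  # emits keystone.cafile = None, etc.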
[ 582.533728] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.connection_uri = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.533881] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.cpu_mode = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.534052] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.cpu_model_extra_flags = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.534216] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.cpu_models = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.534392] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.cpu_power_governor_high = performance {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.534582] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.cpu_power_governor_low = powersave {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.534745] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.cpu_power_management = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.534909] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.535081] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.device_detach_attempts = 8 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.535241] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.device_detach_timeout = 20 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.535400] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.disk_cachemodes = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.535553] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.disk_prefix = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.535713] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.enabled_perf_events = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.535919] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.file_backed_memory = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.536081] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.gid_maps = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.536250] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.hw_disk_discard = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.536403] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.hw_machine_type = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.536568] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.images_rbd_ceph_conf = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.536730] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.536941] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.537147] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.images_rbd_glance_store_name = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.537317] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.images_rbd_pool = rbd {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.537490] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.images_type = default {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.537641] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.images_volume_group = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.537799] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.inject_key = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.537955] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.inject_partition = -2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.538124] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.inject_password = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.538282] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.iscsi_iface = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.538439] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.iser_use_multipath = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.538597] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_bandwidth = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.538755] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.538912] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_downtime = 500 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.539078] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.539236] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.539388] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_inbound_addr = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.539568] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.539731] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_permit_post_copy = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.539890] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_scheme = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.540067] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_timeout_action = abort {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.540228] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_tunnelled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.540382] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_uri = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.540546] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.live_migration_with_native_tls = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.540724] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.max_queues = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.540886] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.541051] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.nfs_mount_options = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.541361] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.541559] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.541728] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.num_iser_scan_tries = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.541886] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.num_memory_encrypted_guests = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.542058] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.542221] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.num_pcie_ports = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.542379] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.num_volume_scan_tries = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.542539] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.pmem_namespaces = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.542693] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.quobyte_client_cfg = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.542982] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.543162] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.rbd_connect_timeout = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.543322] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.543479] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.543637] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.rbd_secret_uuid = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.543787] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.rbd_user = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.543974] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.544154] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.remote_filesystem_transport = ssh {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.544306] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.rescue_image_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.544456] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.rescue_kernel_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.544607] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.rescue_ramdisk_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.544768] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.544918] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.rx_queue_size = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.545085] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.smbfs_mount_options = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.545351] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.545514] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.snapshot_compression = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.545668] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.snapshot_image_format = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.545878] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.546050] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.sparse_logical_volumes = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.546209] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.swtpm_enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.546370] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.swtpm_group = tss {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.546528] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.swtpm_user = tss {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.546692] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.sysinfo_serial = unique {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.546841] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.tx_queue_size = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.546996] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.uid_maps = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.547164] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.use_virtio_for_bridges = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.547325] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.virt_type = kvm {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.547483] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.volume_clear = zero {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.547638] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.volume_clear_size = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.547792] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.volume_use_multipath = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.547964] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.vzstorage_cache_path = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.548145] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.548307] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.vzstorage_mount_group = qemu {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.548468] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.vzstorage_mount_opts = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.548630] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.549150] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.549150] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.vzstorage_mount_user = stack {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.549259] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
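The live_migration_* trio above (downtime = 500 ms, downtime_steps = 10, downtime_delay = 75) drives a stepped increase of the maximum pause a migrating guest may suffer. The sketch below is only an illustration of such a ramp under those values, not Nova's exact formula (in real deployments the delay also scales with the amount of guest data to transfer):

    # Illustrative ramp only -- not Nova's algorithm.
    def downtime_schedule(max_downtime_ms=500, steps=10, delay_s=75):
        """Yield (seconds_into_migration, allowed_downtime_ms) pairs."""
        base = max_downtime_ms / steps            # first allowance: 50 ms
        for n in range(1, steps + 1):
            yield (n - 1) * delay_s, int(base * n)

    for t, ms in downtime_schedule():
        print(f"t+{t:3d}s: allow up to {ms} ms of guest pause")
    # final step: t+675s, 500 ms -- the configured ceiling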
[ 582.549393] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.auth_section = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.549582] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.auth_type = password {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.549743] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.549897] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.550065] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.550223] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.connect_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.550375] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.connect_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.550550] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.default_floating_pool = public {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.550719] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.endpoint_override = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.550878] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.extension_sync_interval = 600 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.551044] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.http_retries = 3 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.551205] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.551355] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.551505] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.max_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.551667] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.551852] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.min_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.552038] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.ovs_bridge = br-int {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.552205] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.physnets = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.552370] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.region_name = RegionOne {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.552534] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.service_metadata_proxy = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.552707] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.service_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.552855] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.service_type = network {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.553015] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.553173] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.status_code_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.553323] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.status_code_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.553474] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.553650] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.553803] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] neutron.version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
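Blocks like [neutron] above (auth_type = password, service_type = network, region_name = RegionOne, valid_interfaces = ['internal', 'public']) are consumed through keystoneauth1's conf-loading helpers. A sketch of the usual pattern, assuming CONF is a ConfigOpts instance in which these options have already been registered and parsed the way Nova does at startup:

    from keystoneauth1 import loading as ks_loading
    from oslo_config import cfg

    CONF = cfg.CONF  # assumed: [neutron] options registered, config parsed

    auth = ks_loading.load_auth_from_conf_options(CONF, 'neutron')
    sess = ks_loading.load_session_from_conf_options(CONF, 'neutron', auth=auth)
    adapter = ks_loading.load_adapter_from_conf_options(CONF, 'neutron',
                                                        session=sess)
    # The adapter resolves the endpoint from the service catalog using
    # service_type, region_name and valid_interfaces; endpoint_override = None
    # means no catalog bypass is configured here.
    print(adapter.get_endpoint())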
[ 582.553969] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] notifications.bdms_in_notifications = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.554149] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] notifications.default_level = INFO {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.554315] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] notifications.notification_format = unversioned {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.554470] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] notifications.notify_on_state_change = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.554638] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.554809] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] pci.alias = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.554971] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] pci.device_spec = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.555142] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] pci.report_in_placement = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.555308] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.auth_section = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.555482] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.auth_type = password {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.555644] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.555798] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.555979] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.556158] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.556312] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.connect_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.556466] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.connect_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.556620] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.default_domain_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.556770] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.default_domain_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.556920] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.domain_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.557079] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.domain_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.557233] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.endpoint_override = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.557388] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.557539] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.557690] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.max_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.557837] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.min_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.557996] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.password = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.558161] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.project_domain_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.558319] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.project_domain_name = Default {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.558481] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.project_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.558650] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.project_name = service {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.558814] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.region_name = RegionOne {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.558967] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.service_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.559139] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.service_type = placement {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.559297] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.559477] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.status_code_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.559625] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.status_code_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.559780] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.system_scope = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.559963] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.560140] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.trust_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.560293] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.user_domain_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.560455] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.user_domain_name = Default {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.560641] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.user_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.560813] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.username = placement {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.560990] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.561158] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] placement.version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.561328] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.cores = 20 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.561488] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.count_usage_from_placement = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.561658] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.561832] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.injected_file_content_bytes = 10240 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.561987] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.injected_file_path_length = 255 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.562157] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.injected_files = 5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.562315] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.instances = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.562474] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.key_pairs = 100 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.562633] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.metadata_items = 128 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.562792] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.ram = 51200 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.562948] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.recheck_quota = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.563122] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.server_group_members = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.563286] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] quota.server_groups = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.563447] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] rdp.enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.563750] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
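The [quota] block above is the easiest one to sanity-check by hand: with cores = 20, ram = 51200 (MiB) and instances = 10, a hypothetical 4-vCPU/8192-MiB flavor is capped by the core quota, not the instance count:

    quota = {'instances': 10, 'cores': 20, 'ram': 51200}   # values from the log
    flavor = {'vcpus': 4, 'ram_mb': 8192}                   # hypothetical flavor

    headroom = min(quota['instances'],                      # 10
                   quota['cores'] // flavor['vcpus'],       # 20 // 4      = 5
                   quota['ram'] // flavor['ram_mb'])        # 51200 // 8192 = 6
    print(headroom)  # -> 5: a sixth such boot would exceed quota.cores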
[ 582.563957] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.564139] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.564298] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.image_metadata_prefilter = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.564452] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.564611] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.max_attempts = 3 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.564766] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.max_placement_results = 1000 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.564929] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.565092] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.query_placement_for_availability_zone = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.565251] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.query_placement_for_image_type_support = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.565402] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.565571] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] scheduler.workers = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.565743] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.565908] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.566087] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.566252] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.566412] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.566569] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.566729] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.566910] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.567080] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.host_subset_size = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.567239] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.567396] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.567557] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.isolated_hosts = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.567716] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.isolated_images = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.567874] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.568146] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.568229] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.pci_in_placement = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.568388] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.568541] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.568695] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.568848] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.569010] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.569194] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.569357] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.track_instance_changes = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.569571] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.569724] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] metrics.required = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.569882] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] metrics.weight_multiplier = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.570047] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.570209] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] metrics.weight_setting = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
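enabled_filters and the *_weight_multiplier values above describe the two phases of host selection: filters veto hosts, then weighers rank the survivors and the top host_subset_size (1 here) are kept. A toy sketch of that pipeline, with stand-in filters and weighers rather than Nova's actual classes:

    def schedule(hosts, filters, weighers, host_subset_size=1):
        # Phase 1: every enabled filter must accept the host.
        survivors = [h for h in hosts if all(f(h) for f in filters)]
        # Phase 2: multiplier-weighted sum; negative multipliers penalize.
        survivors.sort(key=lambda h: sum(m * w(h) for m, w in weighers),
                       reverse=True)
        return survivors[:host_subset_size]

    weighers = [
        (1.0, lambda h: h['free_ram_mb']),   # ram_weight_multiplier = 1.0
        (-1.0, lambda h: h['num_io_ops']),   # io_ops_weight_multiplier = -1.0
    ]
    filters = [lambda h: h['enabled']]       # stand-in for ComputeFilter etc.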
[ 582.570543] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.570710] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] serial_console.enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.570884] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] serial_console.port_range = 10000:20000 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.571057] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.571220] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.571381] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] serial_console.serialproxy_port = 6083 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.571539] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_user.auth_section = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.571711] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_user.auth_type = password {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.571891] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_user.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.572060] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_user.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.572219] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_user.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.572372] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_user.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.572522] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_user.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.572684] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_user.send_service_user_token = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.572837] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_user.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.572987] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] service_user.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.573160] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.agent_enabled = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.573326] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.573602] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.573787] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.573952] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.html5proxy_port = 6082 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.574120] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.image_compression = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.574274] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.jpeg_compression = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.574425] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.playback_compression = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.574588] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.server_listen = 127.0.0.1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.574751] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.574903] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.streaming_mode = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 582.575064] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] spice.zlib_compression = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] upgrade_levels.compute = auto {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.575697] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] upgrade_levels.conductor = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.575863] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] upgrade_levels.scheduler = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.576057] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vendordata_dynamic_auth.auth_section = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.576219] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vendordata_dynamic_auth.auth_type = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.576371] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vendordata_dynamic_auth.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.576522] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vendordata_dynamic_auth.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.576677] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.576831] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vendordata_dynamic_auth.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.576982] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vendordata_dynamic_auth.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.577170] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.577325] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vendordata_dynamic_auth.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.577491] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.api_retry_count = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.577647] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.ca_file = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.577810] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.cache_prefix = devstack-image-cache {{(pid=60400) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.577967] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.cluster_name = testcl1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.578137] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.connection_pool_size = 10 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.578288] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.console_delay_seconds = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.578447] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.datastore_regex = ^datastore.* {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.578644] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.578806] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.host_password = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.578965] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.host_port = 443 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.579136] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.host_username = administrator@vsphere.local {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.579296] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.insecure = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.579474] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.integration_bridge = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.579661] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.maximum_objects = 100 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.579786] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.pbm_default_policy = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.579966] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.pbm_enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.580150] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.pbm_wsdl_location = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.580317] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.580551] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.serial_port_proxy_uri = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.580719] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.serial_port_service_uri = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.580899] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.task_poll_interval = 0.5 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.581089] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.use_linked_clone = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.581275] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.vnc_keymap = en-us {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.581441] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.vnc_port = 5900 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.581605] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vmware.vnc_port_total = 10000 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.581793] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vnc.auth_schemes = ['none'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.581958] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vnc.enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.582251] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.582431] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.582599] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vnc.novncproxy_port = 6080 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.582768] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vnc.server_listen = 127.0.0.1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.582937] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.583104] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 
None None] vnc.vencrypt_ca_certs = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.583261] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vnc.vencrypt_client_cert = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.583413] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vnc.vencrypt_client_key = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.583579] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.583739] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.583923] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.584099] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.584256] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.disable_rootwrap = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.584410] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.enable_numa_live_migration = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.584566] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.584732] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.584941] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.585123] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.libvirt_disable_apic = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.585281] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.585438] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] 
workarounds.qemu_monitor_announce_self_count = 3 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.585665] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.585747] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.585899] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.586063] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.586220] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.586381] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.586548] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.586972] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.587047] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.587192] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.client_socket_timeout = 900 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.587355] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.default_pool_size = 1000 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.587517] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.keep_alive = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.587686] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.max_header_line = 16384 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.587838] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.secure_proxy_ssl_header 
= None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.587995] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.ssl_ca_file = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.588170] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.ssl_cert_file = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.588321] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.ssl_key_file = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.588481] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.tcp_keepidle = 600 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.588651] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.588810] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] zvm.ca_file = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.588963] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] zvm.cloud_connector_url = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.589265] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.589459] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] zvm.reachable_timeout = 300 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.589639] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_policy.enforce_new_defaults = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.589809] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_policy.enforce_scope = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.589980] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_policy.policy_default_rule = default {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.590169] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.590339] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_policy.policy_file = policy.yaml {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.590515] 
env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.590669] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.590823] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.590973] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.591142] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.591304] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.591472] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.591650] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] profiler.connection_string = messaging:// {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.591831] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] profiler.enabled = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.592017] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] profiler.es_doc_type = notification {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.592174] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] profiler.es_scroll_size = 10000 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.592334] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] profiler.es_scroll_time = 2m {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.592499] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] profiler.filter_error_trace = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.592663] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] profiler.hmac_keys = SECRET_KEY {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.592822] env[60400]: DEBUG oslo_service.service [None 
req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] profiler.sentinel_service_name = mymaster {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.592988] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] profiler.socket_timeout = 0.1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.593158] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] profiler.trace_sqlalchemy = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.593320] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] remote_debug.host = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.593475] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] remote_debug.port = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.593716] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.593812] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.593968] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.594136] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.594291] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.594442] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.594606] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.594747] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.594901] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.595063] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] 
oslo_messaging_rabbit.kombu_compression = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.595227] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.595386] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.595546] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.595706] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.595876] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.596069] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.596234] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.596391] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.596560] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.596722] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.596877] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.597048] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.597207] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.597369] env[60400]: DEBUG 
oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.597524] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.597683] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.ssl = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.597845] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.598013] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.598173] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.598335] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.598508] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_rabbit.ssl_version = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.598711] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.598872] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_notifications.retry = -1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.599065] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.599234] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_messaging_notifications.transport_url = **** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.599396] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.auth_section = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.599578] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.auth_type = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.599737] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None 
None] oslo_limit.cafile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.599888] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.certfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.600055] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.collect_timing = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.600237] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.connect_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.600398] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.connect_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.600557] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.endpoint_id = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.600704] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.endpoint_override = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.600868] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.insecure = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.601034] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.keyfile = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.601191] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.max_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.601340] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.min_version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.601492] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.region_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.601679] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.service_name = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.601811] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.service_type = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.601982] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.split_loggers = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.602150] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None 
None] oslo_limit.status_code_retries = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.602303] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.status_code_retry_delay = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.602453] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.timeout = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.602609] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.valid_interfaces = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.602760] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_limit.version = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.602919] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_reports.file_event_handler = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.603091] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.603248] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] oslo_reports.log_dir = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.603410] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.603572] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.603740] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.603937] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.604125] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.604282] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.604450] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=60400) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.604594] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_ovs_privileged.group = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.604745] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.604900] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.605064] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.605215] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] vif_plug_ovs_privileged.user = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.605375] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_linux_bridge.flat_interface = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.605546] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.605710] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.605871] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.606042] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.606205] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.606364] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.606519] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.606691] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_ovs.isolate_vif = False {{(pid=60400) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.606852] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.607030] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.607208] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.607373] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_ovs.ovsdb_interface = native {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.607531] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_vif_ovs.per_port_bridge = False {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.607693] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] os_brick.lock_path = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.607861] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] privsep_osbrick.capabilities = [21] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.608047] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] privsep_osbrick.group = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.608207] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] privsep_osbrick.helper_command = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.608367] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.608525] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.608683] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] privsep_osbrick.user = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.608842] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.608992] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] nova_sys_admin.group = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.609155] env[60400]: DEBUG oslo_service.service [None 
req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] nova_sys_admin.helper_command = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.609312] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.609484] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.609657] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] nova_sys_admin.user = None {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 582.609786] env[60400]: DEBUG oslo_service.service [None req-47482f50-ebbd-4207-b4f8-9381f262e1d4 None None] ******************************************************************************** {{(pid=60400) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 582.610203] env[60400]: INFO nova.service [-] Starting compute node (version 0.1.0) [ 582.620410] env[60400]: INFO nova.virt.node [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Generated node identity a29934a0-6a74-4b6e-8edf-44d7a53db1dc [ 582.620668] env[60400]: INFO nova.virt.node [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Wrote node identity a29934a0-6a74-4b6e-8edf-44d7a53db1dc to /opt/stack/data/n-cpu-1/compute_id [ 582.632711] env[60400]: WARNING nova.compute.manager [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Compute nodes ['a29934a0-6a74-4b6e-8edf-44d7a53db1dc'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 582.664021] env[60400]: INFO nova.compute.manager [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 582.683977] env[60400]: WARNING nova.compute.manager [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
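The wall of "group.option = value" lines that ends above is oslo.config's standard effective-configuration dump: at startup the service hands its logger to ConfigOpts.log_opt_values(), which walks every registered option group, prints one DEBUG line per option (masking anything registered with secret=True, which is why host_password and transport_url appear as ****), and closes with the row of asterisks logged from cfg.py:2613. A minimal sketch of the same mechanism, with a hypothetical three-option group standing in for the hundreds nova actually registers:

    import logging

    from oslo_config import cfg

    LOG = logging.getLogger(__name__)

    # Hypothetical option group for illustration only; nova registers
    # far more options across many groups.
    opts = [
        cfg.StrOpt('host_ip', default='vc1.example.test',
                   help='vCenter host to connect to.'),
        cfg.StrOpt('host_password', secret=True, default='hunter2',
                   help='Rendered as **** in the dump because secret=True.'),
        cfg.IntOpt('api_retry_count', default=10,
                   help='Retries for failed vCenter API calls.'),
    ]
    cfg.CONF.register_opts(opts, group='vmware')

    if __name__ == '__main__':
        logging.basicConfig(level=logging.DEBUG)
        cfg.CONF([], project='nova')  # parse no CLI args; defaults apply
        # Emits one "vmware.option = value" DEBUG line per registered
        # option, then the terminating line of asterisks, exactly the
        # shape of the dump in the log above.
        cfg.CONF.log_opt_values(LOG, logging.DEBUG)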
[ 582.684222] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 582.684425] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 582.684559] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 582.684744] env[60400]: DEBUG nova.compute.resource_tracker [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 582.685831] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fd9a683-8dc6-43f2-8962-8ee14a11b492 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.694758] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e08ade9b-9729-4250-8889-3e20ca79ff9f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.708812] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87b24b48-22da-42b6-8b35-ac0df37019e1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.715404] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4d15669-3db7-4d19-82ef-8594e8320a9b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.744238] env[60400]: DEBUG nova.compute.resource_tracker [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181792MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 582.744390] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 582.744566] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 582.756710] env[60400]: WARNING nova.compute.resource_tracker [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] No compute node record for cpu-1:a29934a0-6a74-4b6e-8edf-44d7a53db1dc: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host a29934a0-6a74-4b6e-8edf-44d7a53db1dc could not be found.
[ 582.769992] env[60400]: INFO nova.compute.resource_tracker [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: a29934a0-6a74-4b6e-8edf-44d7a53db1dc
[ 582.816016] env[60400]: DEBUG nova.compute.resource_tracker [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 582.816138] env[60400]: DEBUG nova.compute.resource_tracker [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 582.913311] env[60400]: INFO nova.scheduler.client.report [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] [req-3759a850-a3ed-492f-b393-74dfd0acc819] Created resource provider record via placement API for resource provider with UUID a29934a0-6a74-4b6e-8edf-44d7a53db1dc and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28.
[ 582.928880] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5800b450-1693-4a8e-ae14-3e28f7ec1630 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.936228] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c131238e-4012-4cc9-9afd-26d357f72239 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.965344] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-381cd82d-668a-42f3-bd0f-a82c6ad928bd {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.971976] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffb3710a-b94c-4c17-a9b9-bbcda24c2a42 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 582.984668] env[60400]: DEBUG nova.compute.provider_tree [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Updating inventory in ProviderTree for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 583.017961] env[60400]: DEBUG nova.scheduler.client.report [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Updated inventory for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}}
[ 583.018186] env[60400]: DEBUG nova.compute.provider_tree [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Updating resource provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc generation from 0 to 1 during operation: update_inventory {{(pid=60400) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
[ 583.018322] env[60400]: DEBUG nova.compute.provider_tree [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Updating inventory in ProviderTree for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 583.058976] env[60400]: DEBUG nova.compute.provider_tree [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Updating resource provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc generation from 1 to 2 during operation: update_traits {{(pid=60400) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
[ 583.075824] env[60400]: DEBUG nova.compute.resource_tracker [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 583.075978] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 583.076147] env[60400]: DEBUG nova.service [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Creating RPC server for service compute {{(pid=60400) start /opt/stack/nova/nova/service.py:182}}
[ 583.094616] env[60400]: DEBUG nova.service [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] Join ServiceGroup membership for this service compute {{(pid=60400) start /opt/stack/nova/nova/service.py:199}}
[ 583.094803] env[60400]: DEBUG nova.servicegroup.drivers.db [None req-bd56da70-cd1c-4aff-bd49-83fd5fed980f None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=60400) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}}
[ 587.097694] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 587.108869] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Getting list of instances from cluster (obj){
[ 587.108869] env[60400]: value = "domain-c8"
[ 587.108869] env[60400]: _type = "ClusterComputeResource"
[ 587.108869] env[60400]: } {{(pid=60400) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 587.110047] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c3c9637-6151-41ab-97b4-76f001c81652 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 587.119733] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Got total of 0 instances {{(pid=60400) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 587.119948] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 587.120281] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Getting list of instances from cluster (obj){
[ 587.120281] env[60400]: value = "domain-c8"
[ 587.120281] env[60400]: _type = "ClusterComputeResource"
[ 587.120281] env[60400]: } {{(pid=60400) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 587.121395] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef226f7d-8ad4-4156-ba11-5f436fdf7ba1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 587.129565] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Got total of 0 instances {{(pid=60400) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 625.395532] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "148f525a-f3c0-40f2-8527-9607cd5e581b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 625.395865] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "148f525a-f3c0-40f2-8527-9607cd5e581b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 625.423342] env[60400]: DEBUG nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 625.563421] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 625.563585] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 625.566498] env[60400]: INFO nova.compute.claims [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 625.719162] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd1a5812-9812-4e2c-8fe6-4a82b1125345 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.727349] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8ae198e-16f3-47ac-9fba-997b291efda5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.766317] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e092bfd3-8746-4487-bba5-28f9f1e3dd38 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.774434] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dc66f25-5405-47d3-9e27-10b88f63ae7e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.790019] env[60400]: DEBUG nova.compute.provider_tree [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 625.805113] env[60400]: DEBUG nova.scheduler.client.report [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 625.848023] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 625.848023] env[60400]: DEBUG nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 625.899848] env[60400]: DEBUG nova.compute.utils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 625.904461] env[60400]: DEBUG nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 625.904905] env[60400]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 625.913965] env[60400]: DEBUG nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 626.009319] env[60400]: DEBUG nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 627.429075] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=<?>,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T04:32:18Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}}
[ 627.429075] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}}
[ 627.429075] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}}
[ 627.429655] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}}
[ 627.429655] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}}
[ 627.429655] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}}
[ 627.429655] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}}
[ 627.429655] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}}
[ 627.430096] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}}
[ 627.430096] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}}
[ 627.430096] env[60400]: DEBUG nova.virt.hardware [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}}
[ 627.431541] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d65e0bd6-2f23-4770-a149-6b8112ce27ee {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.442065] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2386cccc-d4a4-4b29-ae0b-dfaf2e2ea20d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.463129] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91a1aaa8-9e10-47c4-bc40-549aa9eb3722 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.605044] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "30c40353-01fe-407d-8d56-0f6c166d12e3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 627.605268] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "30c40353-01fe-407d-8d56-0f6c166d12e3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 627.619324] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 627.688580] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 627.688796] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 627.690310] env[60400]: INFO nova.compute.claims [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 627.718841] env[60400]: DEBUG nova.policy [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c7081d000b0e4ff8ab58084f5d0f9f41', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd7324cc23ca541659f0f82bd61038ec5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}}
[ 627.818414] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7106b4df-79e0-40ed-a19a-3fe7a9ac8856 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.831051] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3318975d-80dc-43ec-b9e3-62ce8129403f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.871088] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-313e804a-90eb-4310-96de-808e884ac4d4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.881592] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52ca5106-8b89-4617-a5b4-b4f6d0c2d4c7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.897178] env[60400]: DEBUG nova.compute.provider_tree [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 627.909224] env[60400]: DEBUG nova.scheduler.client.report [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 627.930654] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 627.931311] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 627.981727] env[60400]: DEBUG nova.compute.utils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 627.983553] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 627.983779] env[60400]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 627.998714] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 628.092479] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 628.105799] env[60400]: DEBUG nova.policy [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd23e0a3670f64e449edc8f3bfdee61c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a7835207110449299e6f867f379be296', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}}
[ 628.125160] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=<?>,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T04:32:18Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}}
[ 628.125160] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}}
[ 628.125381] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}}
[ 628.125485] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}}
[ 628.125619] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}}
[ 628.125753] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}}
[ 628.125944] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}}
[ 628.126118] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}}
[ 628.126278] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}}
[ 628.126430] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}}
[ 628.126589] env[60400]: DEBUG nova.virt.hardware [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}}
[ 628.127771] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c6bcc9f-6bc5-403f-98d9-ec705bf378bb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.143904] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22e3e3dc-51a6-43f9-a5e9-ab31e44df6a8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 628.282490] env[60400]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Successfully created port: ef02e80e-5a32-4117-a33d-353d6f7bd53a {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 628.909729] env[60400]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Successfully created port: de57e989-a8d1-4474-ba2a-45abd1d9c209 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 629.450826] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "130961ce-1e22-4320-abc9-30fc5f652be3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 629.451022] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "130961ce-1e22-4320-abc9-30fc5f652be3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 629.470173] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 629.536284] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 629.536284] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 629.536455] env[60400]: INFO nova.compute.claims [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 629.685924] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b08ae536-c9d7-4b6c-84fb-2639aee99db9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 629.697168] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c6b513e-e2f0-45cd-baba-97608c7025d2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 629.734698] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53493837-c933-4cdc-b8eb-6c754d183c91 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 629.743757] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e19022b-9ae1-4607-90bf-c1663dbe9b7c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 629.766326] env[60400]: DEBUG nova.compute.provider_tree [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 629.780631] env[60400]: DEBUG nova.scheduler.client.report [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 629.801531] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 629.801531] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 629.843913] env[60400]: DEBUG nova.compute.utils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 629.849315] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 629.850089] env[60400]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 629.862556] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 629.938347] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 629.976401] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=<?>,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T04:32:18Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}}
[ 629.976644] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}}
[ 629.976799] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}}
[ 629.977058] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}}
[ 629.977192] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}}
[ 629.977314] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}}
[ 629.977536] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}}
[ 629.977868] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}}
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 629.977868] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 629.978019] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 629.978193] env[60400]: DEBUG nova.virt.hardware [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 629.979385] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dc65313-600a-41d8-9f7f-52c4ada9bed0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.994797] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25a7cd5b-8d7b-499a-9ca9-1039734603f8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.184379] env[60400]: DEBUG nova.policy [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d6907f55ce847c2908c56f082eb622a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '367973317efd4063b56c3f337ad62856', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 631.221976] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "4540cd82-440c-41e3-8bfa-b384da6fc964" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.222274] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "4540cd82-440c-41e3-8bfa-b384da6fc964" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.240025] env[60400]: DEBUG nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 
tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 631.305492] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.305714] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.309050] env[60400]: INFO nova.compute.claims [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 631.384796] env[60400]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Successfully updated port: ef02e80e-5a32-4117-a33d-353d6f7bd53a {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 631.403310] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "refresh_cache-148f525a-f3c0-40f2-8527-9607cd5e581b" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.403310] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquired lock "refresh_cache-148f525a-f3c0-40f2-8527-9607cd5e581b" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 631.403310] env[60400]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 631.441374] env[60400]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Successfully created port: e3346b4f-9fac-4a32-9f5c-0bd441628fb1 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 631.482190] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-106a5ba4-c977-455c-b919-a278db68ed08 {{(pid=60400) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.493013] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d7912fc-0546-4de3-bd45-6ae57379c866 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.527951] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a4de6af-4057-4572-a060-986595845070 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.536375] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34de9660-9a98-44e8-9933-d130e67cd15d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.553097] env[60400]: DEBUG nova.compute.provider_tree [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 631.566278] env[60400]: DEBUG nova.scheduler.client.report [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 631.596549] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.597470] env[60400]: DEBUG nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 631.637165] env[60400]: DEBUG nova.compute.utils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 631.637165] env[60400]: DEBUG nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Allocating IP information in the background. 
{{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 631.637165] env[60400]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 631.651046] env[60400]: DEBUG nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 631.666199] env[60400]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 631.731358] env[60400]: DEBUG nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 631.762110] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=<?>,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T04:32:18Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 631.762110] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 631.762110] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 631.762280] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member]
Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 631.762665] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 631.762992] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 631.763326] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 631.763617] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 631.763887] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 631.764158] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 631.764445] env[60400]: DEBUG nova.virt.hardware [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 631.765400] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4d6470c-38b9-4ce6-a231-ea98f30e7ed8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.780789] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21bbf0cd-2d62-4c52-8752-a71aaa9c8d16 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.847663] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.847916] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.857700] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 631.910831] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.911576] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.913023] env[60400]: INFO nova.compute.claims [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 632.059449] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a0bd240-fc29-4dc3-a4e5-1242d93b2224 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.063775] env[60400]: DEBUG nova.policy [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '933d0db8f7d6467692158e28db97f69e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd62d59712e2f4a1db623289edd1f497a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 632.070023] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90d241d1-e8de-418a-98e3-8b1a4637579f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.103240] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9f1096e-9189-415a-bd8d-40ddb99a052a {{(pid=60400) request_handler
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.113285] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b15a9884-636c-42ef-bb8f-067dcb42c780 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.128486] env[60400]: DEBUG nova.compute.provider_tree [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 632.140652] env[60400]: DEBUG nova.scheduler.client.report [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 632.158265] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 632.162439] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 632.196671] env[60400]: DEBUG nova.compute.utils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 632.200384] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Allocating IP information in the background. 
{{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 632.200584] env[60400]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 632.209892] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 632.214664] env[60400]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Successfully updated port: de57e989-a8d1-4474-ba2a-45abd1d9c209 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 632.242858] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "refresh_cache-30c40353-01fe-407d-8d56-0f6c166d12e3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 632.243722] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquired lock "refresh_cache-30c40353-01fe-407d-8d56-0f6c166d12e3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 632.245067] env[60400]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 632.323991] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 632.337204] env[60400]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Instance cache missing network info. 
{{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 632.354764] env[60400]: DEBUG nova.virt.hardware [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=<?>,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T04:32:18Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 632.355047] env[60400]: DEBUG nova.virt.hardware [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 632.355221] env[60400]: DEBUG nova.virt.hardware [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 632.355407] env[60400]: DEBUG nova.virt.hardware [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 632.355551] env[60400]: DEBUG nova.virt.hardware [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 632.358259] env[60400]: DEBUG nova.virt.hardware [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 632.358512] env[60400]: DEBUG nova.virt.hardware [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 632.358680] env[60400]: DEBUG nova.virt.hardware [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 632.358847] env[60400]: DEBUG nova.virt.hardware [None
req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 632.359016] env[60400]: DEBUG nova.virt.hardware [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 632.359188] env[60400]: DEBUG nova.virt.hardware [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 632.360417] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82fca94a-1cf0-4834-97c4-aae0ae161158 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.375637] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dfe02ac-3236-4de2-b02f-cb16f8104237 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.844408] env[60400]: DEBUG nova.policy [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0768a2b288954d3fa18c861d45c577d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f8c98a74b81e4f658a4a0ee9e97d8ab0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 633.035956] env[60400]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Updating instance_info_cache with network_info: [{"id": "ef02e80e-5a32-4117-a33d-353d6f7bd53a", "address": "fa:16:3e:52:06:62", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.240", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapef02e80e-5a", "ovs_interfaceid": "ef02e80e-5a32-4117-a33d-353d6f7bd53a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.059036] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Releasing lock "refresh_cache-148f525a-f3c0-40f2-8527-9607cd5e581b" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.059036] env[60400]: DEBUG nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Instance network_info: |[{"id": "ef02e80e-5a32-4117-a33d-353d6f7bd53a", "address": "fa:16:3e:52:06:62", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.240", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapef02e80e-5a", "ovs_interfaceid": "ef02e80e-5a32-4117-a33d-353d6f7bd53a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 633.059366] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:52:06:62', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d054505-89d3-49c5-8b38-5da917a42c49', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ef02e80e-5a32-4117-a33d-353d6f7bd53a', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 633.074599] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Creating folder: OpenStack. Parent ref: group-v4. 
{{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 633.076098] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1de834df-41d7-4af7-ba07-0d0ad37757a6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.094250] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Created folder: OpenStack in parent group-v4. [ 633.094379] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Creating folder: Project (d7324cc23ca541659f0f82bd61038ec5). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 633.097497] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dd67064f-9ad1-47dd-8864-010d04cec972 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.104897] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Created folder: Project (d7324cc23ca541659f0f82bd61038ec5) in parent group-v119075. [ 633.105103] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Creating folder: Instances. Parent ref: group-v119076. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 633.105330] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f61beb36-efb1-462e-8d37-1b90f38aa5fe {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.117724] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Created folder: Instances in parent group-v119076. [ 633.118053] env[60400]: DEBUG oslo.service.loopingcall [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 633.118291] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 633.118466] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-87a69eb8-0c14-4333-80a7-747f0a6a8925 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.140276] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 633.140276] env[60400]: value = "task-449752" [ 633.140276] env[60400]: _type = "Task" [ 633.140276] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 633.148922] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449752, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 633.253249] env[60400]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Updating instance_info_cache with network_info: [{"id": "de57e989-a8d1-4474-ba2a-45abd1d9c209", "address": "fa:16:3e:24:67:ec", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapde57e989-a8", "ovs_interfaceid": "de57e989-a8d1-4474-ba2a-45abd1d9c209", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.273033] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Releasing lock "refresh_cache-30c40353-01fe-407d-8d56-0f6c166d12e3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.273033] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Instance network_info: |[{"id": "de57e989-a8d1-4474-ba2a-45abd1d9c209", "address": "fa:16:3e:24:67:ec", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapde57e989-a8", "ovs_interfaceid": "de57e989-a8d1-4474-ba2a-45abd1d9c209", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 633.273251] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:67:ec', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d054505-89d3-49c5-8b38-5da917a42c49', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'de57e989-a8d1-4474-ba2a-45abd1d9c209', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 633.282628] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Creating folder: Project (a7835207110449299e6f867f379be296). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 633.283291] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f12eed0c-7103-420d-b2b6-26a4f5b4877a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.295225] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Created folder: Project (a7835207110449299e6f867f379be296) in parent group-v119075. [ 633.295225] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Creating folder: Instances. Parent ref: group-v119079. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 633.295225] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-94209fa8-b2f5-421f-a30a-fe593d1288c3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.305772] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Created folder: Instances in parent group-v119079. [ 633.305772] env[60400]: DEBUG oslo.service.loopingcall [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 633.305772] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 633.306167] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4563d5dd-8e62-4d53-ad52-b39fda3048a9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.328290] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 633.328290] env[60400]: value = "task-449755" [ 633.328290] env[60400]: _type = "Task" [ 633.328290] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 633.336579] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449755, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 633.653188] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449752, 'name': CreateVM_Task, 'duration_secs': 0.43631} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 633.653756] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 633.670368] env[60400]: DEBUG oslo_vmware.service [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59637513-57ee-4c94-b4b8-04ac8558ff11 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.676654] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.676819] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 633.677482] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 633.677812] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d5bf1388-c86c-4eef-9548-451991aa14ca {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.682515] env[60400]: DEBUG oslo_vmware.api [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Waiting for the task: (returnval){ [ 633.682515] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]527bb5f0-4da2-8454-2e0a-1f3eed62ce5c" [ 633.682515] env[60400]: _type = "Task" [ 633.682515] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 633.690247] env[60400]: DEBUG oslo_vmware.api [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]527bb5f0-4da2-8454-2e0a-1f3eed62ce5c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 633.757073] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.757710] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.772909] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 633.835449] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.835717] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.837206] env[60400]: INFO nova.compute.claims [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 633.844975] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449755, 'name': CreateVM_Task, 'duration_secs': 0.31474} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 633.844975] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 633.845319] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.994243] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-419accf9-25e8-47c1-88ad-6e69ee57357d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.005313] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09b357c7-cc0b-497e-a700-6bfa8f4b9f50 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.038424] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62ce573e-db22-4725-b70d-01a2968170dd {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.046981] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a43515d4-1044-42b7-a5d3-50850cb98539 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.065023] env[60400]: DEBUG nova.compute.provider_tree [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 634.076805] env[60400]: DEBUG nova.scheduler.client.report [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 634.093874] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.094494] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 
65bf8cf0-825c-42d8-bd78-62a6277d29d7] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 634.140360] env[60400]: DEBUG nova.compute.utils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 634.141589] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 634.142200] env[60400]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 634.162447] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 634.195941] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 634.196239] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 634.196473] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 634.196614] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 634.199518] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Creating directory with path [datastore1] 
devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 634.199518] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 634.199518] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 634.199518] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d07ede46-6305-4e28-9876-281a49eb95cb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.201979] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-03a8e14d-6c2c-490b-8355-112932c05e92 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.222685] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 634.222865] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 634.225092] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12f82a98-d24c-4b9b-8205-a31690c51393 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.228307] env[60400]: DEBUG oslo_vmware.api [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Waiting for the task: (returnval){ [ 634.228307] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5250a540-fd33-db88-b24c-ac40392f7506" [ 634.228307] env[60400]: _type = "Task" [ 634.228307] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 634.239505] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b495247b-83ee-4558-b521-8b4196dcb861 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.245957] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 634.246353] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 634.246495] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 634.248727] env[60400]: DEBUG oslo_vmware.api [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Waiting for the task: (returnval){ [ 634.248727] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52d4b52b-a446-7040-7053-ed2b743f0b40" [ 634.248727] env[60400]: _type = "Task" [ 634.248727] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 634.260040] env[60400]: DEBUG oslo_vmware.api [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52d4b52b-a446-7040-7053-ed2b743f0b40, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 634.267871] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Start spawning the instance on the hypervisor. 
{{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 634.297757] env[60400]: DEBUG nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=<?>,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T04:32:18Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 634.298609] env[60400]: DEBUG nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 634.298609] env[60400]: DEBUG nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 634.298609] env[60400]: DEBUG nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 634.298609] env[60400]: DEBUG nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 634.298775] env[60400]: DEBUG nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 634.298881] env[60400]: DEBUG nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 634.299023] env[60400]: DEBUG nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 634.299702] env[60400]: DEBUG 
nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 634.299873] env[60400]: DEBUG nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 634.301138] env[60400]: DEBUG nova.virt.hardware [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 634.301138] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83bb11b0-b713-452f-aa37-64669a2db3ed {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.312904] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98b11ce1-e153-499e-ad64-55caa9596752 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.364240] env[60400]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Successfully updated port: e3346b4f-9fac-4a32-9f5c-0bd441628fb1 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 634.377069] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "refresh_cache-130961ce-1e22-4320-abc9-30fc5f652be3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 634.377225] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquired lock "refresh_cache-130961ce-1e22-4320-abc9-30fc5f652be3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 634.377555] env[60400]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 634.402599] env[60400]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Successfully created port: ba27078a-b766-43c9-a18e-6eb75ada4eeb {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 634.411066] env[60400]: DEBUG oslo_concurrency.lockutils [None 
req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "cc1d534d-6a43-4575-895d-c3bef84d772e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.411279] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "cc1d534d-6a43-4575-895d-c3bef84d772e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.421211] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 634.476420] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.477915] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.478598] env[60400]: INFO nova.compute.claims [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 634.526336] env[60400]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
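The Acquiring/acquired pair above is per-instance serialization: _locked_do_build_and_run_instance runs under a lock named after the instance UUID, so concurrent build requests for cc1d534d-... queue up instead of interleaving. A minimal sketch of the pattern with the real oslo.concurrency helper (nova actually applies a synchronized decorator; the context-manager form below is the same mechanism), where do_build is a placeholder for the build work.

    from oslo_concurrency import lockutils

    def locked_build_and_run_instance(instance_uuid, do_build):
        # lockutils emits the "Acquiring lock ... waited" / "released ... held"
        # bookkeeping seen in the log as this block is entered and exited.
        with lockutils.lock(instance_uuid):
            return do_build(instance_uuid)

    # locked_build_and_run_instance(
    #     "cc1d534d-6a43-4575-895d-c3bef84d772e", lambda u: print("building", u))
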
[ 634.692563] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00342e25-1d8a-45c7-b5b7-6fc43a6274f9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.703932] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f34cfbc2-d231-4b6f-a1bf-f402f4b46b56 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.710785] env[60400]: DEBUG nova.policy [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fde1ca8282c8470092a272f43a40b55d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e5c7d3736204e8eafce9963fa2a28eb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 634.740408] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e4f7b24-9940-4097-b4d3-16e7835eca4c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.744034] env[60400]: DEBUG nova.compute.manager [req-569062cd-2db8-4a9e-ba17-c3ba334ee35d req-59766112-7638-4ec8-b03b-f869edeb9ca6 service nova] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Received event network-vif-plugged-ef02e80e-5a32-4117-a33d-353d6f7bd53a {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 634.744239] env[60400]: DEBUG oslo_concurrency.lockutils [req-569062cd-2db8-4a9e-ba17-c3ba334ee35d req-59766112-7638-4ec8-b03b-f869edeb9ca6 service nova] Acquiring lock "148f525a-f3c0-40f2-8527-9607cd5e581b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.744384] env[60400]: DEBUG oslo_concurrency.lockutils [req-569062cd-2db8-4a9e-ba17-c3ba334ee35d req-59766112-7638-4ec8-b03b-f869edeb9ca6 service nova] Lock "148f525a-f3c0-40f2-8527-9607cd5e581b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.744539] env[60400]: DEBUG oslo_concurrency.lockutils [req-569062cd-2db8-4a9e-ba17-c3ba334ee35d req-59766112-7638-4ec8-b03b-f869edeb9ca6 service nova] Lock "148f525a-f3c0-40f2-8527-9607cd5e581b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.744685] env[60400]: DEBUG nova.compute.manager [req-569062cd-2db8-4a9e-ba17-c3ba334ee35d req-59766112-7638-4ec8-b03b-f869edeb9ca6 service nova] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] No waiting events found dispatching network-vif-plugged-ef02e80e-5a32-4117-a33d-353d6f7bd53a {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 634.744862]
env[60400]: WARNING nova.compute.manager [req-569062cd-2db8-4a9e-ba17-c3ba334ee35d req-59766112-7638-4ec8-b03b-f869edeb9ca6 service nova] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Received unexpected event network-vif-plugged-ef02e80e-5a32-4117-a33d-353d6f7bd53a for instance with vm_state building and task_state spawning. [ 634.757387] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83bd5c92-7917-48e1-a94c-da8127f87c74 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.777621] env[60400]: DEBUG nova.compute.provider_tree [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 634.779253] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 634.779595] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Creating directory with path [datastore1] vmware_temp/677a0c20-4765-4d88-8ade-b11df6169364/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 634.780177] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fa2693d7-178d-4a4f-987a-c8f33b677db5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.791615] env[60400]: DEBUG nova.scheduler.client.report [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 634.802146] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Created directory with path [datastore1] vmware_temp/677a0c20-4765-4d88-8ade-b11df6169364/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 634.802146] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Fetch image to [datastore1] vmware_temp/677a0c20-4765-4d88-8ade-b11df6169364/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk 
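"Preparing fetch location" above amounts to creating a throwaway directory on the datastore, keyed by a fresh UUID, to hold the sparse image until it lands in the cache. A sketch of how those [datastore] paths are composed; the layout is read off the log lines and the function name is illustrative.

    import posixpath
    import uuid

    def image_fetch_paths(datastore, image_id):
        # e.g. "[datastore1] vmware_temp/<random-uuid>/<image-id>/tmp-sparse.vmdk"
        tmp_dir = posixpath.join("vmware_temp", str(uuid.uuid4()), image_id)
        ds_dir = f"[{datastore}] {tmp_dir}"  # MakeDirectory target
        vmdk = f"[{datastore}] {posixpath.join(tmp_dir, 'tmp-sparse.vmdk')}"
        return ds_dir, vmdk

    # image_fetch_paths("datastore1", "f5dfd970-7a56-4489-873c-2c3b6fbd9fe9")
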
{{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 634.802394] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/677a0c20-4765-4d88-8ade-b11df6169364/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 634.803019] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81abab21-d2f6-4c93-bb04-569a27b1e30b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.813357] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.813917] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 634.817225] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24f98ab6-3d56-4e02-ad8a-2cd76911d232 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.829021] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1c68ab9-dae0-417e-9de1-fa027820a604 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.864932] env[60400]: DEBUG nova.compute.utils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 634.866635] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75a45c57-9222-4844-97ea-31c7bf7e87ea {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.869674] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Allocating IP information in the background. 
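"Allocating IP information in the background" marks the point where network allocation is handed off so that block device mappings can be built in parallel; nova does this with eventlet green threads. The stdlib sketch below shows the same overlap with a thread pool, with all three callables as placeholders.

    from concurrent.futures import ThreadPoolExecutor

    def build_resources(allocate_network, build_block_devices, instance):
        with ThreadPoolExecutor(max_workers=1) as pool:
            nw = pool.submit(allocate_network, instance)   # runs in the background
            bdms = build_block_devices(instance)           # meanwhile, on this thread
            network_info = nw.result()                     # join before spawning the VM
        return network_info, bdms
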
{{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 634.869843] env[60400]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 634.875748] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0e3ffcac-4c6d-4f06-94de-8120935ba1f6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.878327] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 634.961515] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 634.964960] env[60400]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Successfully created port: 4eca35f8-e2f3-4bf1-a56a-851182d59348 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 634.971847] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 634.994353] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 634.994353] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Flavor limits 0:0:0 
{{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 634.994353] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 634.994592] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 634.994592] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 634.994592] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 634.994592] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 634.994592] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 634.994748] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 634.994748] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 634.994748] env[60400]: DEBUG nova.virt.hardware [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 634.999178] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aa522d8-8092-4274-afcc-28eb0f2e8d85 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.006459] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3c73b02-5b84-4f3c-9150-32d64000fc0d {{(pid=60400) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.072228] env[60400]: DEBUG oslo_vmware.rw_handles [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/677a0c20-4765-4d88-8ade-b11df6169364/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 635.136220] env[60400]: DEBUG nova.compute.manager [req-e0548769-35fa-4532-8bc5-35976b0e641c req-ee93604c-a9dc-4551-92da-418ef5ecb93d service nova] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Received event network-vif-plugged-de57e989-a8d1-4474-ba2a-45abd1d9c209 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 635.136435] env[60400]: DEBUG oslo_concurrency.lockutils [req-e0548769-35fa-4532-8bc5-35976b0e641c req-ee93604c-a9dc-4551-92da-418ef5ecb93d service nova] Acquiring lock "30c40353-01fe-407d-8d56-0f6c166d12e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.136746] env[60400]: DEBUG oslo_concurrency.lockutils [req-e0548769-35fa-4532-8bc5-35976b0e641c req-ee93604c-a9dc-4551-92da-418ef5ecb93d service nova] Lock "30c40353-01fe-407d-8d56-0f6c166d12e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.136845] env[60400]: DEBUG oslo_concurrency.lockutils [req-e0548769-35fa-4532-8bc5-35976b0e641c req-ee93604c-a9dc-4551-92da-418ef5ecb93d service nova] Lock "30c40353-01fe-407d-8d56-0f6c166d12e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.136994] env[60400]: DEBUG nova.compute.manager [req-e0548769-35fa-4532-8bc5-35976b0e641c req-ee93604c-a9dc-4551-92da-418ef5ecb93d service nova] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] No waiting events found dispatching network-vif-plugged-de57e989-a8d1-4474-ba2a-45abd1d9c209 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 635.138460] env[60400]: WARNING nova.compute.manager [req-e0548769-35fa-4532-8bc5-35976b0e641c req-ee93604c-a9dc-4551-92da-418ef5ecb93d service nova] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Received unexpected event network-vif-plugged-de57e989-a8d1-4474-ba2a-45abd1d9c209 for instance with vm_state building and task_state spawning. [ 635.140187] env[60400]: DEBUG oslo_vmware.rw_handles [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Completed reading data from the image iterator.
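The rw_handles entries above are the actual byte copy: the image iterator is streamed over an HTTPS write connection to the ESX host's /folder endpoint, addressed by the dcPath and dsName query parameters visible in the URL. A rough sketch of that upload using requests instead of oslo_vmware's hand-built connection; session-ticket handling and the certificate policy are deliberately simplified.

    import requests

    def upload_to_datastore(host, rel_path, data_iter, dc_path, ds_name, cookies=None):
        # URL shape from the log: https://<host>:443/folder/<path>?dcPath=...&dsName=...
        url = f"https://{host}:443/folder/{rel_path}"
        resp = requests.put(
            url,
            params={"dcPath": dc_path, "dsName": ds_name},
            data=data_iter,     # any iterable of byte chunks
            cookies=cookies,    # a vSphere session ticket would go here
            verify=False,       # lab setting; verify certificates in production
        )
        resp.raise_for_status()  # then close the write handle, as logged
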
{{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 635.140529] env[60400]: DEBUG oslo_vmware.rw_handles [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/677a0c20-4765-4d88-8ade-b11df6169364/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 635.183243] env[60400]: DEBUG nova.policy [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b3a409f2c61a48f784c6b761b1ff1309', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69b9135b39df41b49fbd80c72a9cab5c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 635.709029] env[60400]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Updating instance_info_cache with network_info: [{"id": "e3346b4f-9fac-4a32-9f5c-0bd441628fb1", "address": "fa:16:3e:99:0c:3e", "network": {"id": "107b9535-b774-4def-b614-5b4cdda24022", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-272790240-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "367973317efd4063b56c3f337ad62856", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c58d99d-ec12-4fc3-ab39-042b3f8cbb89", "external-id": "nsx-vlan-transportzone-44", "segmentation_id": 44, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3346b4f-9f", "ovs_interfaceid": "e3346b4f-9fac-4a32-9f5c-0bd441628fb1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.725825] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Releasing lock "refresh_cache-130961ce-1e22-4320-abc9-30fc5f652be3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 635.728088] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 
130961ce-1e22-4320-abc9-30fc5f652be3] Instance network_info: |[{"id": "e3346b4f-9fac-4a32-9f5c-0bd441628fb1", "address": "fa:16:3e:99:0c:3e", "network": {"id": "107b9535-b774-4def-b614-5b4cdda24022", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-272790240-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "367973317efd4063b56c3f337ad62856", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c58d99d-ec12-4fc3-ab39-042b3f8cbb89", "external-id": "nsx-vlan-transportzone-44", "segmentation_id": 44, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3346b4f-9f", "ovs_interfaceid": "e3346b4f-9fac-4a32-9f5c-0bd441628fb1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 635.728411] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:99:0c:3e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8c58d99d-ec12-4fc3-ab39-042b3f8cbb89', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e3346b4f-9fac-4a32-9f5c-0bd441628fb1', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 635.737044] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Creating folder: Project (367973317efd4063b56c3f337ad62856). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 635.738167] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-46d5c44a-8e73-4975-9354-4359ac190cd4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.749798] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Created folder: Project (367973317efd4063b56c3f337ad62856) in parent group-v119075. [ 635.749980] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Creating folder: Instances. Parent ref: group-v119082. 
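The two CreateFolder calls above build the per-tenant inventory layout: a "Project (<tenant-id>)" folder under the OpenStack root folder (group-v119075 here), with an "Instances" folder inside it, and either level is reused if it already exists. A toy model of that idempotent create-or-reuse behaviour against an in-memory tree rather than the vSphere API:

    def create_folder(parent, name):
        # Folder.CreateFolder raises DuplicateName when the folder exists;
        # nova treats that as success and reuses it, which setdefault models.
        return parent.setdefault(name, {})

    def project_folders(root, project_id):
        project = create_folder(root, f"Project ({project_id})")
        return create_folder(project, "Instances")

    # inventory = {}
    # project_folders(inventory, "367973317efd4063b56c3f337ad62856")
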
{{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 635.750216] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9805bd6b-651b-4218-a462-d1ca634e9071 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.761556] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Created folder: Instances in parent group-v119082. [ 635.761781] env[60400]: DEBUG oslo.service.loopingcall [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 635.761956] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 635.762164] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-549ae853-e4da-401f-9778-91777e554597 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.790023] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 635.790023] env[60400]: value = "task-449758" [ 635.790023] env[60400]: _type = "Task" [ 635.790023] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 635.798339] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449758, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 636.301438] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449758, 'name': CreateVM_Task, 'duration_secs': 0.28756} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 636.301631] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 636.302243] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 636.302396] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 636.302738] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 636.303021] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3c1794a2-cc9d-4b64-b344-a186c4fe67f4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.310791] env[60400]: DEBUG oslo_vmware.api [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Waiting for the task: (returnval){ [ 636.310791] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5237e4b8-f595-ce63-efea-5329485ab9b9" [ 636.310791] env[60400]: _type = "Task" [ 636.310791] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 636.317142] env[60400]: DEBUG oslo_vmware.api [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5237e4b8-f595-ce63-efea-5329485ab9b9, 'name': SearchDatastore_Task} progress is 0%. 
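The lock sequence above (continuing below) is the image-cache pattern: a lock on the cache directory serializes the existence check, then a separate lock on the .vmdk guards the expensive fetch, so only one request per image downloads it. A compressed sketch with the real lockutils helper; the cache set and fetch callable are stand-ins, and the "external semaphore" step from the log is omitted.

    from oslo_concurrency import lockutils

    def fetch_image_if_missing(cache, datastore, image_id, fetch):
        base = f"[{datastore}] devstack-image-cache_base/{image_id}"
        with lockutils.lock(base):      # guards the SearchDatastore existence check
            cached = base in cache
        if not cached:
            with lockutils.lock(f"{base}/{image_id}.vmdk"):
                if base not in cache:   # re-check: another builder may have won
                    fetch()
                    cache.add(base)
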
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 636.825391] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 636.825900] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 636.826772] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.082280] env[60400]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Successfully created port: 29b6e5e1-0961-4c22-9cd8-d8a073552857 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 637.170092] env[60400]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Successfully created port: 83178c1c-b6a2-44c4-b05c-c995d7e267ff {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 637.261129] env[60400]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Successfully updated port: ba27078a-b766-43c9-a18e-6eb75ada4eeb {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 637.277329] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "refresh_cache-4540cd82-440c-41e3-8bfa-b384da6fc964" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.277329] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquired lock "refresh_cache-4540cd82-440c-41e3-8bfa-b384da6fc964" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 637.277329] env[60400]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 
4540cd82-440c-41e3-8bfa-b384da6fc964] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 637.597687] env[60400]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 637.936920] env[60400]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Successfully updated port: 4eca35f8-e2f3-4bf1-a56a-851182d59348 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 637.958140] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "refresh_cache-a45f24ab-afe1-4ffd-a917-11b68a0b29ec" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.958140] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquired lock "refresh_cache-a45f24ab-afe1-4ffd-a917-11b68a0b29ec" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 637.958140] env[60400]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 638.090142] env[60400]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Instance cache missing network info. 
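"Building network info cache" rebuilds the per-instance view of its Neutron ports; when the cache is empty, as for 4540cd82-... here, the result is the network_info blob logged just below, persisted with the instance so later operations skip the Neutron round trip. A trimmed sketch of that structure, keeping only a few of the logged fields, with a dict standing in for the database:

    import json

    def build_network_info(port):
        # Field names follow the network_info dump below; the devname is the
        # port id truncated to fit an interface name (e.g. tapba27078a-b7).
        return [{
            "id": port["id"],
            "address": port["mac_address"],
            "type": "ovs",
            "devname": "tap" + port["id"][:11],
            "network": {"id": port["network_id"], "bridge": "br-int"},
            "active": port["status"] == "ACTIVE",
        }]

    def save_info_cache(db, instance_uuid, network_info):
        db[instance_uuid] = json.dumps(network_info)  # cached as JSON text
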
{{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 638.334469] env[60400]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Updating instance_info_cache with network_info: [{"id": "ba27078a-b766-43c9-a18e-6eb75ada4eeb", "address": "fa:16:3e:63:9b:6a", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapba27078a-b7", "ovs_interfaceid": "ba27078a-b766-43c9-a18e-6eb75ada4eeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.359166] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Releasing lock "refresh_cache-4540cd82-440c-41e3-8bfa-b384da6fc964" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 638.363009] env[60400]: DEBUG nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Instance network_info: |[{"id": "ba27078a-b766-43c9-a18e-6eb75ada4eeb", "address": "fa:16:3e:63:9b:6a", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapba27078a-b7", "ovs_interfaceid": "ba27078a-b766-43c9-a18e-6eb75ada4eeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 638.364398] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None 
req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:63:9b:6a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d054505-89d3-49c5-8b38-5da917a42c49', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ba27078a-b766-43c9-a18e-6eb75ada4eeb', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 638.373476] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Creating folder: Project (d62d59712e2f4a1db623289edd1f497a). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.374255] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dd0e117a-4db0-4354-9b30-4d0a2c8d1059 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.387745] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Created folder: Project (d62d59712e2f4a1db623289edd1f497a) in parent group-v119075. [ 638.387922] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Creating folder: Instances. Parent ref: group-v119085. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.391125] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4b9826a1-ada4-43ae-93dc-5ef3ea66d09c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.399488] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Created folder: Instances in parent group-v119085. [ 638.399749] env[60400]: DEBUG oslo.service.loopingcall [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 638.399940] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 638.400298] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-52c5fe76-0938-49a8-bded-a059c501ccb5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.426139] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 638.426139] env[60400]: value = "task-449761" [ 638.426139] env[60400]: _type = "Task" [ 638.426139] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 638.438794] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449761, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 638.941669] env[60400]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Updating instance_info_cache with network_info: [{"id": "4eca35f8-e2f3-4bf1-a56a-851182d59348", "address": "fa:16:3e:bd:8c:05", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4eca35f8-e2", "ovs_interfaceid": "4eca35f8-e2f3-4bf1-a56a-851182d59348", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.946791] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 638.948162] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449761, 'name': CreateVM_Task, 'duration_secs': 0.41037} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 638.948651] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 638.949536] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 638.949713] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 638.950983] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 638.952344] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.952344] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 638.952703] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 638.952966] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9c886c44-67b5-4da8-9f62-faa22095ea3a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.964020] env[60400]: DEBUG oslo_vmware.api [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Waiting for the task: (returnval){ [ 638.964020] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5253eab4-f9d9-6ccc-8dfc-a09736adbad9" [ 638.964020] env[60400]: _type = "Task" [ 638.964020] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 638.969749] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Releasing lock "refresh_cache-a45f24ab-afe1-4ffd-a917-11b68a0b29ec" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 638.970079] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Instance network_info: |[{"id": "4eca35f8-e2f3-4bf1-a56a-851182d59348", "address": "fa:16:3e:bd:8c:05", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4eca35f8-e2", "ovs_interfaceid": "4eca35f8-e2f3-4bf1-a56a-851182d59348", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 638.971364] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bd:8c:05', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d054505-89d3-49c5-8b38-5da917a42c49', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4eca35f8-e2f3-4bf1-a56a-851182d59348', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 638.984621] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Creating folder: Project (f8c98a74b81e4f658a4a0ee9e97d8ab0). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.992099] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b4a15f08-6c6d-47d3-9c09-a6888b44df39 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.995954] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Skipping network cache update for instance because it is Building. 
[ 638.996270] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 638.996419] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 638.996623] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 638.996758] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 638.996939] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 638.997113] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 638.997319] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}}
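Each "Skipping network cache update" record above is the heal task passing over an instance whose vm_state is still BUILDING; once every candidate is skipped, it reports that nothing is left to heal. A hedged sketch of that filter, assuming a list of instance objects with a vm_state attribute:

    from nova.compute import vm_states  # nova's vm_state constants

    def instances_to_heal(instances):
        to_heal = []
        for inst in instances:
            if inst.vm_state == vm_states.BUILDING:
                # "Skipping network cache update for instance because it is Building."
                continue
            to_heal.append(inst)
        # An empty result yields "Didn't find any instances for network info
        # cache update." and the task ends without doing anything.
        return to_heal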
[ 638.997920] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 638.998181] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 638.998414] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 638.998772] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 639.002366] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 639.002366] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 639.002366] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 639.002366] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 639.002366] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 639.002366] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
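The run of "Running periodic task" records above is oslo.service's periodic-task machinery iterating over ComputeManager methods; the reclaim task then short-circuits on its config guard. A hedged sketch of how such a task is declared (the spacing value and option registration are illustrative, not nova's actual values):

    import logging

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF
    LOG = logging.getLogger(__name__)

    class ComputeManagerSketch(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60)  # interval is illustrative
        def _reclaim_queued_deletes(self, context):
            if CONF.reclaim_instance_interval <= 0:
                # Matches the record above; the task is a no-op when disabled.
                LOG.debug('CONF.reclaim_instance_interval <= 0, skipping...')
                return
            # ... reclaim SOFT_DELETED instances older than the interval ...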
[ 639.002680] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 639.008273] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Created folder: Project (f8c98a74b81e4f658a4a0ee9e97d8ab0) in parent group-v119075.
[ 639.008453] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Creating folder: Instances. Parent ref: group-v119088. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 639.008670] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b62b8cb1-9f8f-42d9-9798-2b5b53337006 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.021331] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Created folder: Instances in parent group-v119088.
[ 639.021810] env[60400]: DEBUG oslo.service.loopingcall [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 639.023182] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 639.024209] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 639.025193] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 639.028138] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 639.028138] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 639.028138] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0cba552f-b081-4647-94a9-d421199a0155 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.051236] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3d46339-21d6-49b0-acde-de0730023b3a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.060139] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cc14167-651b-4921-afa4-b959ccfdaf90 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.067735] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 639.067735] env[60400]: value = "task-449764"
[ 639.067735] env[60400]: _type = "Task"
[ 639.067735] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 639.083594] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97f00e11-9ed7-4b54-9e21-c398ea1da137 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.089293] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449764, 'name': CreateVM_Task} progress is 10%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 639.095084] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a4388b6-8c5e-48e6-8567-aac449b96d4b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.130148] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181812MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 639.130315] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 639.130518] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 639.229536] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 148f525a-f3c0-40f2-8527-9607cd5e581b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 639.229722] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 30c40353-01fe-407d-8d56-0f6c166d12e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
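The Folder.CreateVM_Task invocation and the task-449764 polling above follow oslo.vmware's invoke-then-wait pattern. A hedged sketch of that pattern; the credentials are placeholders, the host mirrors the session setup at the top of this log, and folder_ref, config_spec and pool_ref stand in for the managed-object references nova builds from the VIF info:

    from oslo_vmware import api

    session = api.VMwareAPISession('vc1.osci.c.eu-de-1.cloud.sap',
                                   'user', 'secret',          # placeholders
                                   api_retry_count=10,
                                   task_poll_interval=0.5)

    # CreateVM_Task returns immediately with a task reference ...
    task_ref = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                                  config=config_spec, pool=pool_ref)
    # ... which wait_for_task() polls (the "progress is 10%." records above)
    # until vCenter reports success, raising if the task fails.
    task_info = session.wait_for_task(task_ref)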
[ 639.229883] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 130961ce-1e22-4320-abc9-30fc5f652be3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 639.230020] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 4540cd82-440c-41e3-8bfa-b384da6fc964 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 639.230144] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 639.230249] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 639.230352] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance cc1d534d-6a43-4575-895d-c3bef84d772e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 639.230538] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 639.230665] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 639.394116] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63247005-d8cd-4007-a588-d371dbdc5da9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.402776] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9175be7f-67e0-4b9e-84ba-d18d0f6054c7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.442675] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f95c5144-b65d-45ec-80e4-a2703789422d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.452155] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98dfcd66-24d1-4e13-85cc-953cf7111a77 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.475210] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 639.495663] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 639.523578] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 639.523578] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.392s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 639.542529] env[60400]: DEBUG nova.compute.manager [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Received event network-changed-ef02e80e-5a32-4117-a33d-353d6f7bd53a {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
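The final resource view above is reproducible from the per-instance placement allocations (seven instances, each DISK_GB=1, MEMORY_MB=128, VCPU=1) plus the 512 MB reserved in the MEMORY_MB inventory; a quick check:

    instances = 7                     # the seven allocations listed above
    used_ram = 512 + instances * 128  # 512 MB reserved + 7 x 128 MB = 1408 MB
    used_disk = instances * 1         # 7 GB of phys_disk=200GB
    used_vcpus = instances * 1        # 7 of total_vcpus=48
    assert (used_ram, used_disk, used_vcpus) == (1408, 7, 7)
    # With allocation_ratio 4.0 on VCPU, placement can schedule up to
    # 48 * 4.0 = 192 vcpus against the 48 physical ones.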
[ 639.542626] env[60400]: DEBUG nova.compute.manager [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Refreshing instance network info cache due to event network-changed-ef02e80e-5a32-4117-a33d-353d6f7bd53a. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}}
[ 639.542831] env[60400]: DEBUG oslo_concurrency.lockutils [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] Acquiring lock "refresh_cache-148f525a-f3c0-40f2-8527-9607cd5e581b" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 639.543074] env[60400]: DEBUG oslo_concurrency.lockutils [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] Acquired lock "refresh_cache-148f525a-f3c0-40f2-8527-9607cd5e581b" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 639.543843] env[60400]: DEBUG nova.network.neutron [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Refreshing network info cache for port ef02e80e-5a32-4117-a33d-353d6f7bd53a {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}}
[ 639.584613] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449764, 'name': CreateVM_Task, 'duration_secs': 0.309871} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
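The four records around port ef02e80e- show the standard reaction to a network-changed event: take the per-instance refresh_cache lock, re-query Neutron, update the cache, release. A hedged sketch of that flow; get_instance_nw_info() is a hypothetical stand-in for the Neutron query, while update_instance_cache_with_nw_info is the helper named in the records' closers:

    from oslo_concurrency import lockutils

    def handle_network_changed(context, instance, port_id):
        # Serialize with other refreshers of the same instance, as the
        # "refresh_cache-<uuid>" lock records above show.
        with lockutils.lock('refresh_cache-%s' % instance.uuid):
            nw_info = get_instance_nw_info(context, instance)  # hypothetical
            # Persist the fresh view into the instance info cache.
            update_instance_cache_with_nw_info(context, instance, nw_info)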
[ 639.584817] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 639.585668] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 639.585668] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 639.586250] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 639.586562] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0bebbfee-d611-4feb-b84b-7faa486d53d8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 639.594536] env[60400]: DEBUG oslo_vmware.api [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Waiting for the task: (returnval){
[ 639.594536] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]522d23b9-632e-ae57-9d08-c424258cad21"
[ 639.594536] env[60400]: _type = "Task"
[ 639.594536] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 639.605364] env[60400]: DEBUG oslo_vmware.api [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]522d23b9-632e-ae57-9d08-c424258cad21, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 639.897732] env[60400]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Successfully updated port: 29b6e5e1-0961-4c22-9cd8-d8a073552857 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 639.923950] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "refresh_cache-cc1d534d-6a43-4575-895d-c3bef84d772e" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 639.924261] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquired lock "refresh_cache-cc1d534d-6a43-4575-895d-c3bef84d772e" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 639.924504] env[60400]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}}
[ 640.048726] env[60400]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 640.107876] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 640.108146] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 640.108353] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 640.165074] env[60400]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Successfully updated port: 83178c1c-b6a2-44c4-b05c-c995d7e267ff {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 640.179203] env[60400]: DEBUG nova.compute.manager [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Received event network-changed-de57e989-a8d1-4474-ba2a-45abd1d9c209 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 640.179385] env[60400]: DEBUG nova.compute.manager [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Refreshing instance network info cache due to event network-changed-de57e989-a8d1-4474-ba2a-45abd1d9c209. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}}
[ 640.179623] env[60400]: DEBUG oslo_concurrency.lockutils [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] Acquiring lock "refresh_cache-30c40353-01fe-407d-8d56-0f6c166d12e3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 640.179759] env[60400]: DEBUG oslo_concurrency.lockutils [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] Acquired lock "refresh_cache-30c40353-01fe-407d-8d56-0f6c166d12e3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 640.179918] env[60400]: DEBUG nova.network.neutron [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Refreshing network info cache for port de57e989-a8d1-4474-ba2a-45abd1d9c209 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}}
[ 640.187856] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "refresh_cache-65bf8cf0-825c-42d8-bd78-62a6277d29d7" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 640.188007] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquired lock "refresh_cache-65bf8cf0-825c-42d8-bd78-62a6277d29d7" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 640.188160] env[60400]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}}
[ 640.326784] env[60400]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 640.585555] env[60400]: DEBUG nova.network.neutron [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Updated VIF entry in instance network info cache for port ef02e80e-5a32-4117-a33d-353d6f7bd53a. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}}
[ 640.585910] env[60400]: DEBUG nova.network.neutron [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Updating instance_info_cache with network_info: [{"id": "ef02e80e-5a32-4117-a33d-353d6f7bd53a", "address": "fa:16:3e:52:06:62", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.240", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapef02e80e-5a", "ovs_interfaceid": "ef02e80e-5a32-4117-a33d-353d6f7bd53a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 640.598831] env[60400]: DEBUG oslo_concurrency.lockutils [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] Releasing lock "refresh_cache-148f525a-f3c0-40f2-8527-9607cd5e581b" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 640.599085] env[60400]: DEBUG nova.compute.manager [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Received event network-vif-plugged-ba27078a-b766-43c9-a18e-6eb75ada4eeb {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 640.599274] env[60400]: DEBUG oslo_concurrency.lockutils [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] Acquiring lock "4540cd82-440c-41e3-8bfa-b384da6fc964-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 640.599462] env[60400]: DEBUG oslo_concurrency.lockutils [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] Lock "4540cd82-440c-41e3-8bfa-b384da6fc964-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 640.599625] env[60400]: DEBUG oslo_concurrency.lockutils [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] Lock "4540cd82-440c-41e3-8bfa-b384da6fc964-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 640.599777] env[60400]: DEBUG nova.compute.manager [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] No waiting events found dispatching network-vif-plugged-ba27078a-b766-43c9-a18e-6eb75ada4eeb {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
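The "-events" lock plus the "No waiting events found" / "Received unexpected event" pair above reflect how external Neutron events are dispatched: under a per-instance event lock the manager pops a registered waiter, and warns when none exists (here the instance is still building, so nothing was waiting on vif-plugged yet). A hedged sketch with a hypothetical waiter registry:

    import logging

    from oslo_concurrency import lockutils

    LOG = logging.getLogger(__name__)

    def dispatch_external_event(instance, event, waiters):
        # waiters: hypothetical dict of greenthread events keyed by
        # (instance uuid, event tag), populated by whoever is waiting.
        with lockutils.lock('%s-events' % instance.uuid):
            waiter = waiters.pop((instance.uuid, event), None)
        if waiter is None:
            LOG.warning('Received unexpected event %s for instance with '
                        'vm_state %s and task_state %s.',
                        event, instance.vm_state, instance.task_state)
        else:
            waiter.send(event)  # wake the thread blocked on this event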
[ 640.599931] env[60400]: WARNING nova.compute.manager [req-359c9fb7-5780-49c2-b4aa-ee5026d36c0a req-586d3828-71e0-48f9-bb34-44263eddec53 service nova] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Received unexpected event network-vif-plugged-ba27078a-b766-43c9-a18e-6eb75ada4eeb for instance with vm_state building and task_state spawning.
[ 640.805175] env[60400]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Updating instance_info_cache with network_info: [{"id": "29b6e5e1-0961-4c22-9cd8-d8a073552857", "address": "fa:16:3e:23:0e:ed", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.233", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29b6e5e1-09", "ovs_interfaceid": "29b6e5e1-0961-4c22-9cd8-d8a073552857", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 640.816920] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Releasing lock "refresh_cache-cc1d534d-6a43-4575-895d-c3bef84d772e" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 640.817022] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Instance network_info: |[{"id": "29b6e5e1-0961-4c22-9cd8-d8a073552857", "address": "fa:16:3e:23:0e:ed", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.233", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29b6e5e1-09", "ovs_interfaceid": "29b6e5e1-0961-4c22-9cd8-d8a073552857", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
"bound_drivers": {"0": "nsxv3"}}, "devname": "tap29b6e5e1-09", "ovs_interfaceid": "29b6e5e1-0961-4c22-9cd8-d8a073552857", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 640.817390] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:23:0e:ed', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d054505-89d3-49c5-8b38-5da917a42c49', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '29b6e5e1-0961-4c22-9cd8-d8a073552857', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 640.829019] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Creating folder: Project (69b9135b39df41b49fbd80c72a9cab5c). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 640.829651] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6c99bb8f-2c2b-40f0-bcd7-32f787c2e091 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.844440] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Created folder: Project (69b9135b39df41b49fbd80c72a9cab5c) in parent group-v119075. [ 640.844749] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Creating folder: Instances. Parent ref: group-v119091. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 640.846411] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2007f248-e77c-4361-9c81-4e269a796c3d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.856783] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Created folder: Instances in parent group-v119091. [ 640.857024] env[60400]: DEBUG oslo.service.loopingcall [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
[ 640.857218] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 640.857412] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b256e5e6-1bcc-408d-b368-ca2b85d7209a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 640.885865] env[60400]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Updating instance_info_cache with network_info: [{"id": "83178c1c-b6a2-44c4-b05c-c995d7e267ff", "address": "fa:16:3e:b8:bc:3c", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap83178c1c-b6", "ovs_interfaceid": "83178c1c-b6a2-44c4-b05c-c995d7e267ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 640.896678] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 640.896678] env[60400]: value = "task-449767"
[ 640.896678] env[60400]: _type = "Task"
[ 640.896678] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 640.906561] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449767, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
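The "Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return." records point into oslo.service's loopingcall module, which suggests a retry wrapper around the VM-creation call. A hedged sketch of that decorator pattern; the exception class and retry counts are illustrative, not nova's actual configuration:

    from oslo_service import loopingcall

    class RetryableVimError(Exception):
        """Illustrative stand-in for a retryable vCenter fault."""

    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                                exceptions=(RetryableVimError,))
    def create_vm(session, folder_ref, config_spec, pool_ref):
        # Invoke CreateVM_Task on the instances folder, then block on it,
        # mirroring the invoke/wait records above.
        task_ref = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                                      config=config_spec, pool=pool_ref)
        return session.wait_for_task(task_ref)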
[ 640.908903] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Releasing lock "refresh_cache-65bf8cf0-825c-42d8-bd78-62a6277d29d7" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 640.909207] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Instance network_info: |[{"id": "83178c1c-b6a2-44c4-b05c-c995d7e267ff", "address": "fa:16:3e:b8:bc:3c", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap83178c1c-b6", "ovs_interfaceid": "83178c1c-b6a2-44c4-b05c-c995d7e267ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 640.909596] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b8:bc:3c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d054505-89d3-49c5-8b38-5da917a42c49', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '83178c1c-b6a2-44c4-b05c-c995d7e267ff', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 640.919402] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Creating folder: Project (6e5c7d3736204e8eafce9963fa2a28eb). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 640.922712] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-26953687-08b6-4b98-b7f4-3e49ccb4838f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 640.938160] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Created folder: Project (6e5c7d3736204e8eafce9963fa2a28eb) in parent group-v119075.
[ 640.938160] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Creating folder: Instances. Parent ref: group-v119094. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 640.938160] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3c46675f-fa42-4d1b-97e5-647fd32cc73b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 640.944912] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Created folder: Instances in parent group-v119094.
[ 640.945649] env[60400]: DEBUG oslo.service.loopingcall [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 640.945972] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 640.946588] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c9100627-7c09-4364-83d5-d4c13c743587 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 640.976732] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 640.976732] env[60400]: value = "task-449770"
[ 640.976732] env[60400]: _type = "Task"
[ 640.976732] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 640.987504] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449770, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 641.065913] env[60400]: DEBUG nova.network.neutron [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Updated VIF entry in instance network info cache for port de57e989-a8d1-4474-ba2a-45abd1d9c209. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}}
[ 641.066316] env[60400]: DEBUG nova.network.neutron [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Updating instance_info_cache with network_info: [{"id": "de57e989-a8d1-4474-ba2a-45abd1d9c209", "address": "fa:16:3e:24:67:ec", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapde57e989-a8", "ovs_interfaceid": "de57e989-a8d1-4474-ba2a-45abd1d9c209", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 641.082670] env[60400]: DEBUG oslo_concurrency.lockutils [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] Releasing lock "refresh_cache-30c40353-01fe-407d-8d56-0f6c166d12e3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 641.085091] env[60400]: DEBUG nova.compute.manager [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Received event network-vif-plugged-e3346b4f-9fac-4a32-9f5c-0bd441628fb1 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 641.085091] env[60400]: DEBUG oslo_concurrency.lockutils [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] Acquiring lock "130961ce-1e22-4320-abc9-30fc5f652be3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 641.085091] env[60400]: DEBUG oslo_concurrency.lockutils [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] Lock "130961ce-1e22-4320-abc9-30fc5f652be3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 641.085091] env[60400]: DEBUG oslo_concurrency.lockutils [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] Lock "130961ce-1e22-4320-abc9-30fc5f652be3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 641.085681] env[60400]: DEBUG nova.compute.manager [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] No waiting events found dispatching network-vif-plugged-e3346b4f-9fac-4a32-9f5c-0bd441628fb1 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 641.085681] env[60400]: WARNING nova.compute.manager [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Received unexpected event network-vif-plugged-e3346b4f-9fac-4a32-9f5c-0bd441628fb1 for instance with vm_state building and task_state spawning.
[ 641.085681] env[60400]: DEBUG nova.compute.manager [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Received event network-changed-e3346b4f-9fac-4a32-9f5c-0bd441628fb1 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 641.085681] env[60400]: DEBUG nova.compute.manager [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Refreshing instance network info cache due to event network-changed-e3346b4f-9fac-4a32-9f5c-0bd441628fb1. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}}
[ 641.085681] env[60400]: DEBUG oslo_concurrency.lockutils [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] Acquiring lock "refresh_cache-130961ce-1e22-4320-abc9-30fc5f652be3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 641.086083] env[60400]: DEBUG oslo_concurrency.lockutils [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] Acquired lock "refresh_cache-130961ce-1e22-4320-abc9-30fc5f652be3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 641.086083] env[60400]: DEBUG nova.network.neutron [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Refreshing network info cache for port e3346b4f-9fac-4a32-9f5c-0bd441628fb1 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}}
[ 641.410370] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449767, 'name': CreateVM_Task, 'duration_secs': 0.337601} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 641.411420] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 641.412448] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 641.412660] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 641.412993] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 641.413441] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bb0f0680-59d5-4cc2-97cc-f68abb15c4ed {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 641.419060] env[60400]: DEBUG oslo_vmware.api [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Waiting for the task: (returnval){
[ 641.419060] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]529147d2-c332-23e6-a668-f87caac05f0f"
[ 641.419060] env[60400]: _type = "Task"
[ 641.419060] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 641.428658] env[60400]: DEBUG oslo_vmware.api [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]529147d2-c332-23e6-a668-f87caac05f0f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 641.486805] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449770, 'name': CreateVM_Task, 'duration_secs': 0.301175} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 641.487044] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 641.487679] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 641.935394] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 641.936563] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 641.936563] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 641.936563] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 641.936563] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 641.936865] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-36b46841-0498-4cd2-a805-505a1286ce11 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 641.941933] env[60400]: DEBUG oslo_vmware.api [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Waiting for the task: (returnval){
[ 641.941933] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]520a3bd0-360f-a141-1bf3-6cf4eaa3a929"
[ 641.941933] env[60400]: _type = "Task"
[ 641.941933] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 641.954098] env[60400]: DEBUG oslo_vmware.api [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]520a3bd0-360f-a141-1bf3-6cf4eaa3a929, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 642.015324] env[60400]: DEBUG nova.network.neutron [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Updated VIF entry in instance network info cache for port e3346b4f-9fac-4a32-9f5c-0bd441628fb1. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 642.015705] env[60400]: DEBUG nova.network.neutron [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Updating instance_info_cache with network_info: [{"id": "e3346b4f-9fac-4a32-9f5c-0bd441628fb1", "address": "fa:16:3e:99:0c:3e", "network": {"id": "107b9535-b774-4def-b614-5b4cdda24022", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-272790240-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "367973317efd4063b56c3f337ad62856", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c58d99d-ec12-4fc3-ab39-042b3f8cbb89", "external-id": "nsx-vlan-transportzone-44", "segmentation_id": 44, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3346b4f-9f", "ovs_interfaceid": "e3346b4f-9fac-4a32-9f5c-0bd441628fb1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 642.028896] env[60400]: DEBUG oslo_concurrency.lockutils [req-ddea34ae-5c8b-47a0-a1e1-9c5b7f341f3b req-9a05312c-c99f-4bc9-a85e-34e1917074e9 service nova] Releasing lock "refresh_cache-130961ce-1e22-4320-abc9-30fc5f652be3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 642.459877] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 642.460226] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 642.460490] env[60400]: 
DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 644.179816] env[60400]: DEBUG nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Received event network-vif-plugged-4eca35f8-e2f3-4bf1-a56a-851182d59348 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 644.180375] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Acquiring lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.180729] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.181034] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.181460] env[60400]: DEBUG nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] No waiting events found dispatching network-vif-plugged-4eca35f8-e2f3-4bf1-a56a-851182d59348 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 644.181747] env[60400]: WARNING nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Received unexpected event network-vif-plugged-4eca35f8-e2f3-4bf1-a56a-851182d59348 for instance with vm_state building and task_state spawning. [ 644.182116] env[60400]: DEBUG nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Received event network-changed-ba27078a-b766-43c9-a18e-6eb75ada4eeb {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 644.182539] env[60400]: DEBUG nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Refreshing instance network info cache due to event network-changed-ba27078a-b766-43c9-a18e-6eb75ada4eeb. 
{{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 644.182906] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Acquiring lock "refresh_cache-4540cd82-440c-41e3-8bfa-b384da6fc964" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 644.185030] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Acquired lock "refresh_cache-4540cd82-440c-41e3-8bfa-b384da6fc964" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 644.185030] env[60400]: DEBUG nova.network.neutron [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Refreshing network info cache for port ba27078a-b766-43c9-a18e-6eb75ada4eeb {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 645.079164] env[60400]: DEBUG nova.compute.manager [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Received event network-vif-plugged-83178c1c-b6a2-44c4-b05c-c995d7e267ff {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 645.079414] env[60400]: DEBUG oslo_concurrency.lockutils [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] Acquiring lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 645.079646] env[60400]: DEBUG oslo_concurrency.lockutils [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] Lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 645.079788] env[60400]: DEBUG oslo_concurrency.lockutils [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] Lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 645.082021] env[60400]: DEBUG nova.compute.manager [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] No waiting events found dispatching network-vif-plugged-83178c1c-b6a2-44c4-b05c-c995d7e267ff {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 645.087459] env[60400]: WARNING nova.compute.manager [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Received unexpected event network-vif-plugged-83178c1c-b6a2-44c4-b05c-c995d7e267ff for instance with vm_state building and task_state spawning. 
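[editor's note] The records above show Nova's external-event handshake: the compute manager receives network-vif-plugged / network-changed events from Neutron, takes the per-instance "-events" lock, and tries to pop a registered waiter for that event; when nothing registered interest it logs "No waiting events found dispatching ..." and warns about an unexpected event. The following is a minimal, self-contained sketch of that waiter pattern, assuming a simple threading.Event registry; the names (InstanceEvents, prepare, pop, handle_external_event) are hypothetical stand-ins, not Nova's actual classes.

# Minimal sketch of the "pop waiting event" pattern visible in the log above:
# a worker registers interest in (instance, event) before triggering the
# action, and the thread that receives the external event either wakes the
# waiter or reports the event as unexpected. Hypothetical names throughout.
import threading
from collections import defaultdict

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()        # plays the role of the "-events" lock
        self._waiters = defaultdict(dict)    # instance uuid -> {event name: Event}

    def prepare(self, instance_uuid, event_name):
        """Register a waiter before starting the action that causes the event."""
        waiter = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_name] = waiter
        return waiter

    def pop(self, instance_uuid, event_name):
        """Called on event receipt; returns the waiter, or None if nobody waits."""
        with self._lock:
            return self._waiters[instance_uuid].pop(event_name, None)

events = InstanceEvents()

def handle_external_event(instance_uuid, event_name):
    waiter = events.pop(instance_uuid, event_name)
    if waiter is None:
        # Corresponds to the "No waiting events found dispatching ..." /
        # "Received unexpected event ..." lines in the log above.
        print(f"unexpected event {event_name} for {instance_uuid}")
    else:
        waiter.set()

# Usage: the spawning thread prepares first, then waits for the plug event.
w = events.prepare("cc1d534d-6a43-4575-895d-c3bef84d772e",
                   "network-vif-plugged-29b6e5e1")
handle_external_event("cc1d534d-6a43-4575-895d-c3bef84d772e",
                      "network-vif-plugged-29b6e5e1")
assert w.wait(timeout=1.0)

In the trace above the WARNING fires because the event arrives while the instance is still in vm_state building / task_state spawning, i.e. before any waiter was registered for it; the sketch reproduces that ordering if handle_external_event runs before prepare.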
[ 645.087459] env[60400]: DEBUG nova.compute.manager [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Received event network-changed-83178c1c-b6a2-44c4-b05c-c995d7e267ff {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 645.087459] env[60400]: DEBUG nova.compute.manager [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Refreshing instance network info cache due to event network-changed-83178c1c-b6a2-44c4-b05c-c995d7e267ff. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 645.087459] env[60400]: DEBUG oslo_concurrency.lockutils [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] Acquiring lock "refresh_cache-65bf8cf0-825c-42d8-bd78-62a6277d29d7" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 645.088444] env[60400]: DEBUG oslo_concurrency.lockutils [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] Acquired lock "refresh_cache-65bf8cf0-825c-42d8-bd78-62a6277d29d7" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 645.088444] env[60400]: DEBUG nova.network.neutron [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Refreshing network info cache for port 83178c1c-b6a2-44c4-b05c-c995d7e267ff {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 645.357048] env[60400]: DEBUG nova.network.neutron [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Updated VIF entry in instance network info cache for port ba27078a-b766-43c9-a18e-6eb75ada4eeb. 
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 645.357332] env[60400]: DEBUG nova.network.neutron [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Updating instance_info_cache with network_info: [{"id": "ba27078a-b766-43c9-a18e-6eb75ada4eeb", "address": "fa:16:3e:63:9b:6a", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapba27078a-b7", "ovs_interfaceid": "ba27078a-b766-43c9-a18e-6eb75ada4eeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 645.377036] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Releasing lock "refresh_cache-4540cd82-440c-41e3-8bfa-b384da6fc964" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 645.377036] env[60400]: DEBUG nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Received event network-changed-4eca35f8-e2f3-4bf1-a56a-851182d59348 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 645.377036] env[60400]: DEBUG nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Refreshing instance network info cache due to event network-changed-4eca35f8-e2f3-4bf1-a56a-851182d59348. 
{{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 645.377036] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Acquiring lock "refresh_cache-a45f24ab-afe1-4ffd-a917-11b68a0b29ec" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 645.377036] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Acquired lock "refresh_cache-a45f24ab-afe1-4ffd-a917-11b68a0b29ec" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 645.377276] env[60400]: DEBUG nova.network.neutron [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Refreshing network info cache for port 4eca35f8-e2f3-4bf1-a56a-851182d59348 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 645.525052] env[60400]: DEBUG nova.network.neutron [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Updated VIF entry in instance network info cache for port 83178c1c-b6a2-44c4-b05c-c995d7e267ff. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 645.525052] env[60400]: DEBUG nova.network.neutron [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Updating instance_info_cache with network_info: [{"id": "83178c1c-b6a2-44c4-b05c-c995d7e267ff", "address": "fa:16:3e:b8:bc:3c", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap83178c1c-b6", "ovs_interfaceid": "83178c1c-b6a2-44c4-b05c-c995d7e267ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 645.537855] env[60400]: DEBUG oslo_concurrency.lockutils [req-1836472d-8047-4b80-b0c7-8dfa509d3606 req-a2357895-28f2-4588-87e5-22438743f15f service nova] Releasing lock "refresh_cache-65bf8cf0-825c-42d8-bd78-62a6277d29d7" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 646.425089] env[60400]: DEBUG nova.network.neutron [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Updated VIF entry in instance network info cache for port 4eca35f8-e2f3-4bf1-a56a-851182d59348. 
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 646.425089] env[60400]: DEBUG nova.network.neutron [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Updating instance_info_cache with network_info: [{"id": "4eca35f8-e2f3-4bf1-a56a-851182d59348", "address": "fa:16:3e:bd:8c:05", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4eca35f8-e2", "ovs_interfaceid": "4eca35f8-e2f3-4bf1-a56a-851182d59348", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 646.437938] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Releasing lock "refresh_cache-a45f24ab-afe1-4ffd-a917-11b68a0b29ec" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 646.438208] env[60400]: DEBUG nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Received event network-vif-plugged-29b6e5e1-0961-4c22-9cd8-d8a073552857 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 646.438391] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Acquiring lock "cc1d534d-6a43-4575-895d-c3bef84d772e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 646.438581] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Lock "cc1d534d-6a43-4575-895d-c3bef84d772e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 646.438730] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Lock "cc1d534d-6a43-4575-895d-c3bef84d772e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 646.438881] env[60400]: DEBUG nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc 
service nova] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] No waiting events found dispatching network-vif-plugged-29b6e5e1-0961-4c22-9cd8-d8a073552857 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 646.439047] env[60400]: WARNING nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Received unexpected event network-vif-plugged-29b6e5e1-0961-4c22-9cd8-d8a073552857 for instance with vm_state building and task_state spawning. [ 646.439205] env[60400]: DEBUG nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Received event network-changed-29b6e5e1-0961-4c22-9cd8-d8a073552857 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 646.439347] env[60400]: DEBUG nova.compute.manager [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Refreshing instance network info cache due to event network-changed-29b6e5e1-0961-4c22-9cd8-d8a073552857. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 646.439562] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Acquiring lock "refresh_cache-cc1d534d-6a43-4575-895d-c3bef84d772e" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 646.439698] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Acquired lock "refresh_cache-cc1d534d-6a43-4575-895d-c3bef84d772e" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 646.439847] env[60400]: DEBUG nova.network.neutron [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Refreshing network info cache for port 29b6e5e1-0961-4c22-9cd8-d8a073552857 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 647.187621] env[60400]: DEBUG nova.network.neutron [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Updated VIF entry in instance network info cache for port 29b6e5e1-0961-4c22-9cd8-d8a073552857. 
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 647.187894] env[60400]: DEBUG nova.network.neutron [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Updating instance_info_cache with network_info: [{"id": "29b6e5e1-0961-4c22-9cd8-d8a073552857", "address": "fa:16:3e:23:0e:ed", "network": {"id": "ed0c4de2-e657-438c-b400-baf260923f2a", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.233", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c82f07917ba4819a6bcf09e15f9f9cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d054505-89d3-49c5-8b38-5da917a42c49", "external-id": "nsx-vlan-transportzone-888", "segmentation_id": 888, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29b6e5e1-09", "ovs_interfaceid": "29b6e5e1-0961-4c22-9cd8-d8a073552857", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 647.201573] env[60400]: DEBUG oslo_concurrency.lockutils [req-35183d41-53b9-4f17-ba1a-b6b426b37fa5 req-616666fc-3cf8-4d7e-9fc8-a262b7cff4bc service nova] Releasing lock "refresh_cache-cc1d534d-6a43-4575-895d-c3bef84d772e" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 647.261583] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "e4f0342a-4169-40aa-b234-a2e2340d5b05" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.262138] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "e4f0342a-4169-40aa-b234-a2e2340d5b05" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.278412] env[60400]: DEBUG nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Starting instance... 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 647.344664] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.344896] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.347544] env[60400]: INFO nova.compute.claims [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 647.604929] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d37897d-024a-4e92-9507-2b2f17f1d6ed {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.617173] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6ffabfd-bff5-4b9b-9263-1475919c48d5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.662174] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f61c4b03-e2f0-4e6a-a73a-63ca329cfeba {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.669909] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46a0bbc2-09c7-4160-b1a5-ddee0e0041fc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.685775] env[60400]: DEBUG nova.compute.provider_tree [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 647.701067] env[60400]: DEBUG nova.scheduler.client.report [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 647.719960] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 
tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.375s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.721653] env[60400]: DEBUG nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 647.765678] env[60400]: DEBUG nova.compute.utils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 647.767769] env[60400]: DEBUG nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 647.768106] env[60400]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 647.782733] env[60400]: DEBUG nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 647.873506] env[60400]: DEBUG nova.policy [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56c77841712b4c63b744211d28c87fb7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2d9ebe5ad8545b9ade33e18b6092a47', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 647.882707] env[60400]: DEBUG nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Start spawning the instance on the hypervisor. 
{{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 647.914414] env[60400]: DEBUG nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 647.914646] env[60400]: DEBUG nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 647.914793] env[60400]: DEBUG nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 647.914961] env[60400]: DEBUG nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 647.915108] env[60400]: DEBUG nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 647.915241] env[60400]: DEBUG nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 647.915445] env[60400]: DEBUG nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 647.915595] env[60400]: DEBUG nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 647.915748] env[60400]: DEBUG 
nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 647.915928] env[60400]: DEBUG nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 647.916062] env[60400]: DEBUG nova.virt.hardware [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 647.916928] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df0815c0-6c9b-48c6-8ef8-f3b6f1e3a5a2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.925665] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1f16bf0-0076-461e-a43f-36bd7de4bd8e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 648.559338] env[60400]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Successfully created port: e6bca69a-edb2-4b21-96fb-d7f0db997395 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 649.771150] env[60400]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Successfully updated port: e6bca69a-edb2-4b21-96fb-d7f0db997395 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 649.785307] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "refresh_cache-e4f0342a-4169-40aa-b234-a2e2340d5b05" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 649.785400] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquired lock "refresh_cache-e4f0342a-4169-40aa-b234-a2e2340d5b05" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 649.785731] env[60400]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 649.872037] env[60400]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 
tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 650.624523] env[60400]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Updating instance_info_cache with network_info: [{"id": "e6bca69a-edb2-4b21-96fb-d7f0db997395", "address": "fa:16:3e:91:4f:35", "network": {"id": "27cb7cb6-6076-4539-bd39-ceb8e2df4126", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-954404858-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2d9ebe5ad8545b9ade33e18b6092a47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fb224918-e294-4b76-80f9-2fa0031b7dc2", "external-id": "nsx-vlan-transportzone-876", "segmentation_id": 876, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6bca69a-ed", "ovs_interfaceid": "e6bca69a-edb2-4b21-96fb-d7f0db997395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 650.645014] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Releasing lock "refresh_cache-e4f0342a-4169-40aa-b234-a2e2340d5b05" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 650.645014] env[60400]: DEBUG nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Instance network_info: |[{"id": "e6bca69a-edb2-4b21-96fb-d7f0db997395", "address": "fa:16:3e:91:4f:35", "network": {"id": "27cb7cb6-6076-4539-bd39-ceb8e2df4126", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-954404858-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2d9ebe5ad8545b9ade33e18b6092a47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fb224918-e294-4b76-80f9-2fa0031b7dc2", "external-id": "nsx-vlan-transportzone-876", "segmentation_id": 876, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6bca69a-ed", "ovs_interfaceid": "e6bca69a-edb2-4b21-96fb-d7f0db997395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 650.645182] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:91:4f:35', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fb224918-e294-4b76-80f9-2fa0031b7dc2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e6bca69a-edb2-4b21-96fb-d7f0db997395', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 650.653425] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Creating folder: Project (c2d9ebe5ad8545b9ade33e18b6092a47). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 650.654794] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ba1fbcac-52fc-403d-a8cd-a4a2a7ac1bad {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 650.667140] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Created folder: Project (c2d9ebe5ad8545b9ade33e18b6092a47) in parent group-v119075. [ 650.667341] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Creating folder: Instances. Parent ref: group-v119097. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 650.667675] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6ae649ce-3108-4a09-bec2-2d523be566d0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 650.680685] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Created folder: Instances in parent group-v119097. [ 650.680947] env[60400]: DEBUG oslo.service.loopingcall [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 650.681155] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 650.681370] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a72d55de-1eb8-4f5b-ad3f-9b70bcf258e5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 650.703377] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 650.703377] env[60400]: value = "task-449773" [ 650.703377] env[60400]: _type = "Task" [ 650.703377] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 650.711624] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449773, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 651.164324] env[60400]: DEBUG nova.compute.manager [req-b97df384-ea4e-4201-873b-9bdb452b12a5 req-f2202907-1c54-49b8-95c9-f5a6b59ea35e service nova] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Received event network-vif-plugged-e6bca69a-edb2-4b21-96fb-d7f0db997395 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 651.165425] env[60400]: DEBUG oslo_concurrency.lockutils [req-b97df384-ea4e-4201-873b-9bdb452b12a5 req-f2202907-1c54-49b8-95c9-f5a6b59ea35e service nova] Acquiring lock "e4f0342a-4169-40aa-b234-a2e2340d5b05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 651.165699] env[60400]: DEBUG oslo_concurrency.lockutils [req-b97df384-ea4e-4201-873b-9bdb452b12a5 req-f2202907-1c54-49b8-95c9-f5a6b59ea35e service nova] Lock "e4f0342a-4169-40aa-b234-a2e2340d5b05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 651.165823] env[60400]: DEBUG oslo_concurrency.lockutils [req-b97df384-ea4e-4201-873b-9bdb452b12a5 req-f2202907-1c54-49b8-95c9-f5a6b59ea35e service nova] Lock "e4f0342a-4169-40aa-b234-a2e2340d5b05-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 651.165999] env[60400]: DEBUG nova.compute.manager [req-b97df384-ea4e-4201-873b-9bdb452b12a5 req-f2202907-1c54-49b8-95c9-f5a6b59ea35e service nova] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] No waiting events found dispatching network-vif-plugged-e6bca69a-edb2-4b21-96fb-d7f0db997395 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 651.166158] env[60400]: WARNING nova.compute.manager [req-b97df384-ea4e-4201-873b-9bdb452b12a5 req-f2202907-1c54-49b8-95c9-f5a6b59ea35e service nova] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Received unexpected event network-vif-plugged-e6bca69a-edb2-4b21-96fb-d7f0db997395 for instance with vm_state building and task_state spawning. [ 651.214926] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449773, 'name': CreateVM_Task, 'duration_secs': 0.288985} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 651.215286] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 651.215778] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 651.215933] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 651.216275] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 651.216518] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6dbcb447-8721-4da0-bee3-ec81802afc29 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 651.221366] env[60400]: DEBUG oslo_vmware.api [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Waiting for the task: (returnval){ [ 651.221366] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]521c0217-cb40-4bad-9e4e-cac4673f5884" [ 651.221366] env[60400]: _type = "Task" [ 651.221366] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 651.231388] env[60400]: DEBUG oslo_vmware.api [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]521c0217-cb40-4bad-9e4e-cac4673f5884, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 651.732925] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 651.732925] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 651.732925] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 655.342850] env[60400]: DEBUG nova.compute.manager [req-4607b0a9-b4e9-4725-a17c-7f42de365ba7 req-942f2394-c6e7-4c8e-9ba0-230f2a67de75 service nova] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Received event network-changed-e6bca69a-edb2-4b21-96fb-d7f0db997395 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 655.342850] env[60400]: DEBUG nova.compute.manager [req-4607b0a9-b4e9-4725-a17c-7f42de365ba7 req-942f2394-c6e7-4c8e-9ba0-230f2a67de75 service nova] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Refreshing instance network info cache due to event network-changed-e6bca69a-edb2-4b21-96fb-d7f0db997395. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 655.343412] env[60400]: DEBUG oslo_concurrency.lockutils [req-4607b0a9-b4e9-4725-a17c-7f42de365ba7 req-942f2394-c6e7-4c8e-9ba0-230f2a67de75 service nova] Acquiring lock "refresh_cache-e4f0342a-4169-40aa-b234-a2e2340d5b05" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 655.343772] env[60400]: DEBUG oslo_concurrency.lockutils [req-4607b0a9-b4e9-4725-a17c-7f42de365ba7 req-942f2394-c6e7-4c8e-9ba0-230f2a67de75 service nova] Acquired lock "refresh_cache-e4f0342a-4169-40aa-b234-a2e2340d5b05" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 655.344082] env[60400]: DEBUG nova.network.neutron [req-4607b0a9-b4e9-4725-a17c-7f42de365ba7 req-942f2394-c6e7-4c8e-9ba0-230f2a67de75 service nova] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Refreshing network info cache for port e6bca69a-edb2-4b21-96fb-d7f0db997395 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 656.036321] env[60400]: DEBUG nova.network.neutron [req-4607b0a9-b4e9-4725-a17c-7f42de365ba7 req-942f2394-c6e7-4c8e-9ba0-230f2a67de75 service nova] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Updated VIF entry in instance network info cache for port e6bca69a-edb2-4b21-96fb-d7f0db997395. 
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 656.036321] env[60400]: DEBUG nova.network.neutron [req-4607b0a9-b4e9-4725-a17c-7f42de365ba7 req-942f2394-c6e7-4c8e-9ba0-230f2a67de75 service nova] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Updating instance_info_cache with network_info: [{"id": "e6bca69a-edb2-4b21-96fb-d7f0db997395", "address": "fa:16:3e:91:4f:35", "network": {"id": "27cb7cb6-6076-4539-bd39-ceb8e2df4126", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-954404858-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2d9ebe5ad8545b9ade33e18b6092a47", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fb224918-e294-4b76-80f9-2fa0031b7dc2", "external-id": "nsx-vlan-transportzone-876", "segmentation_id": 876, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6bca69a-ed", "ovs_interfaceid": "e6bca69a-edb2-4b21-96fb-d7f0db997395", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 656.050660] env[60400]: DEBUG oslo_concurrency.lockutils [req-4607b0a9-b4e9-4725-a17c-7f42de365ba7 req-942f2394-c6e7-4c8e-9ba0-230f2a67de75 service nova] Releasing lock "refresh_cache-e4f0342a-4169-40aa-b234-a2e2340d5b05" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 681.333509] env[60400]: WARNING oslo_vmware.rw_handles [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 681.333509] env[60400]: ERROR oslo_vmware.rw_handles [ 681.334222] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 
tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/677a0c20-4765-4d88-8ade-b11df6169364/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 681.335430] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 681.338141] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Copying Virtual Disk [datastore1] vmware_temp/677a0c20-4765-4d88-8ade-b11df6169364/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/677a0c20-4765-4d88-8ade-b11df6169364/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 681.338141] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-48fd6615-2443-40c9-ba83-219907945cbd {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.348064] env[60400]: DEBUG oslo_vmware.api [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Waiting for the task: (returnval){ [ 681.348064] env[60400]: value = "task-449783" [ 681.348064] env[60400]: _type = "Task" [ 681.348064] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 681.361038] env[60400]: DEBUG oslo_vmware.api [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Task: {'id': task-449783, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.866648] env[60400]: DEBUG oslo_vmware.exceptions [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Fault InvalidArgument not matched. 
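The "Waiting for the task ... to complete" / "progress is 0%" pairs are oslo.vmware's wait_for_task loop polling the vCenter task object until it reaches a terminal state. A self-contained sketch of that polling shape, with a stand-in task object instead of a live vCenter session (poll_task and FakeTask are hypothetical names):

    import time

    class FakeTask:
        """Stand-in for a vCenter task; a real one comes from the session."""
        def __init__(self, states):
            self._states = iter(states)

        def info(self):
            return next(self._states)

    def poll_task(task, interval=0.5):
        # Loop until the task reports success or error, as _poll_task does;
        # on error, oslo.vmware raises the translated fault instead.
        while True:
            state, progress = task.info()
            if state == "success":
                return
            if state == "error":
                raise RuntimeError("task failed")  # translate_fault() in oslo.vmware
            time.sleep(interval)

    poll_task(FakeTask([("running", 0), ("running", 50), ("success", 100)]))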
{{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 681.866648] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 681.868064] env[60400]: ERROR nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 681.868064] env[60400]: Faults: ['InvalidArgument'] [ 681.868064] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Traceback (most recent call last): [ 681.868064] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 681.868064] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] yield resources [ 681.868064] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 681.868064] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] self.driver.spawn(context, instance, image_meta, [ 681.868064] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 681.868064] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 681.868064] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 681.868064] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] self._fetch_image_if_missing(context, vi) [ 681.868064] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] image_cache(vi, tmp_image_ds_loc) [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] vm_util.copy_virtual_disk( [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] session._wait_for_task(vmdk_copy_task) [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] return self.wait_for_task(task_ref) [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] return evt.wait() [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] result = hub.switch() [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 681.868495] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] return self.greenlet.switch() [ 681.868919] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 681.868919] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] self.f(*self.args, **self.kw) [ 681.868919] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 681.868919] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] raise exceptions.translate_fault(task_info.error) [ 681.868919] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 681.868919] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Faults: ['InvalidArgument'] [ 681.868919] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] [ 681.868919] env[60400]: INFO nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Terminating instance [ 681.871229] env[60400]: DEBUG nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Start destroying the instance on the hypervisor. 
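"Fault InvalidArgument not matched" means oslo.vmware's get_fault_class found no specific exception class registered for that fault name, so the generic VimFaultException (carrying fault_list) is raised and propagates out of spawn, as the traceback above shows. A hedged sketch of inspecting it at a call site, assuming a session object with a wait_for_task method:

    from oslo_vmware import exceptions as vexc

    def copy_disk(session, task):
        try:
            session.wait_for_task(task)  # 'session' is a hypothetical VMwareAPISession
        except vexc.VimFaultException as e:
            # fault_list holds the raw vSphere fault names, e.g. ['InvalidArgument'].
            if "InvalidArgument" in (e.fault_list or []):
                raise  # let the compute manager abort the claim and re-schedule
            raise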
{{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 681.871547] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 681.871693] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 681.871871] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 681.873068] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20f6a7a7-bdc4-402f-8290-c0811ec2905f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.880218] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0447b445-a793-4816-94a5-f32608e280de {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.888444] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 681.890404] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-88c0d67e-84e4-4a92-8083-c50e0dae884c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.891648] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 681.891745] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Folder [datastore1] devstack-image-cache_base created. 
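Paths like "[datastore1] devstack-image-cache_base" are vSphere datastore paths: the datastore name in brackets followed by a folder path. oslo.vmware ships a small helper for building them; a sketch, assuming oslo_vmware.objects.datastore.DatastorePath keeps its usual constructor (datastore name plus path components):

    from oslo_vmware.objects.datastore import DatastorePath

    # str() renders the bracketed form used in the MakeDirectory calls above.
    path = DatastorePath("datastore1", "devstack-image-cache_base")
    print(path)  # -> [datastore1] devstack-image-cache_base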
{{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 681.892434] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-59a9f343-58ab-45f6-bbb0-1c898deeaca6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.900175] env[60400]: DEBUG oslo_vmware.api [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Waiting for the task: (returnval){ [ 681.900175] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5260de57-0dae-51a9-932c-85505230261f" [ 681.900175] env[60400]: _type = "Task" [ 681.900175] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 681.905833] env[60400]: DEBUG oslo_vmware.api [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5260de57-0dae-51a9-932c-85505230261f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 682.412880] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 682.414395] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Creating directory with path [datastore1] vmware_temp/18aa9928-b05c-45e1-880a-757e1d0859cf/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 682.414395] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-43b16ec7-bcf4-41f7-a4f2-27a2116c6f6b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.428374] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Created directory with path [datastore1] vmware_temp/18aa9928-b05c-45e1-880a-757e1d0859cf/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 682.428654] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Fetch image to [datastore1] vmware_temp/18aa9928-b05c-45e1-880a-757e1d0859cf/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 682.428820] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] 
vmware_temp/18aa9928-b05c-45e1-880a-757e1d0859cf/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 682.429730] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a004d77-d9b0-4b9a-92d8-61fef269497d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.439693] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6995d996-d57a-4c4f-889f-ce14eb6f9bd6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.450999] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-652348c9-bdd2-4986-97f0-0ecb479c882c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.494159] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bab74cf6-7c13-45e0-b9fa-bbb0e1936058 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.502488] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a3f6474d-6f2c-4e67-92b1-0476c2b4be31 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.594140] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 682.673865] env[60400]: DEBUG oslo_vmware.rw_handles [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/18aa9928-b05c-45e1-880a-757e1d0859cf/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 682.742340] env[60400]: DEBUG oslo_vmware.rw_handles [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 682.742527] env[60400]: DEBUG oslo_vmware.rw_handles [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/18aa9928-b05c-45e1-880a-757e1d0859cf/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
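The earlier WARNING from oslo_vmware.rw_handles showed close() calling conn.getresponse() and the ESX host dropping the connection (http.client.RemoteDisconnected) after the upload had already completed, which is why that download still succeeded. A small sketch of tolerating that race when closing a write handle (safe_close is a hypothetical helper, not oslo.vmware API):

    import http.client

    def safe_close(conn: http.client.HTTPConnection) -> None:
        # The server may close the socket without sending a response once the
        # upload is done; treat that as a clean close, much as the log shows
        # oslo.vmware doing (it logs a warning and carries on).
        try:
            conn.getresponse()
        except http.client.RemoteDisconnected:
            pass
        finally:
            conn.close()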
{{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 685.412219] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 685.412563] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 685.412750] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Deleting the datastore file [datastore1] 148f525a-f3c0-40f2-8527-9607cd5e581b {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 685.413047] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-385b219b-028f-4b4f-966f-01374710bc92 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.420102] env[60400]: DEBUG oslo_vmware.api [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Waiting for the task: (returnval){ [ 685.420102] env[60400]: value = "task-449787" [ 685.420102] env[60400]: _type = "Task" [ 685.420102] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 685.428177] env[60400]: DEBUG oslo_vmware.api [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Task: {'id': task-449787, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 685.931872] env[60400]: DEBUG oslo_vmware.api [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Task: {'id': task-449787, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072439} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 685.932447] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 685.932943] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 685.935164] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 685.935164] env[60400]: INFO nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Took 4.06 seconds to destroy the instance on the hypervisor. [ 685.938059] env[60400]: DEBUG nova.compute.claims [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 685.938379] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.938722] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 686.138754] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-771ee3d2-06a8-4e38-8c35-f8de0a979485 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.147681] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b119c88-0e6a-43af-941f-2a1d3c89cfc6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.194080] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04a7bdb6-5e55-41ec-97a2-a287f5e01331 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.203892] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6b5e4246-d255-4508-8f9e-02cf2aea0099 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.218619] env[60400]: DEBUG nova.compute.provider_tree [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 686.230476] env[60400]: DEBUG nova.scheduler.client.report [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 686.260784] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.322s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 686.264170] env[60400]: ERROR nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 686.264170] env[60400]: Faults: ['InvalidArgument'] [ 686.264170] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Traceback (most recent call last): [ 686.264170] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 686.264170] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] self.driver.spawn(context, instance, image_meta, [ 686.264170] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 686.264170] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 686.264170] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 686.264170] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] self._fetch_image_if_missing(context, vi) [ 686.264170] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 686.264170] env[60400]: ERROR nova.compute.manager 
[instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] image_cache(vi, tmp_image_ds_loc) [ 686.264170] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] vm_util.copy_virtual_disk( [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] session._wait_for_task(vmdk_copy_task) [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] return self.wait_for_task(task_ref) [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] return evt.wait() [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] result = hub.switch() [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] return self.greenlet.switch() [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 686.264574] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] self.f(*self.args, **self.kw) [ 686.264989] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 686.264989] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] raise exceptions.translate_fault(task_info.error) [ 686.264989] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 686.264989] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Faults: ['InvalidArgument'] [ 686.264989] env[60400]: ERROR nova.compute.manager [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] [ 686.264989] env[60400]: DEBUG nova.compute.utils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] VimFaultException {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 686.268494] env[60400]: DEBUG nova.compute.manager [None 
req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Build of instance 148f525a-f3c0-40f2-8527-9607cd5e581b was re-scheduled: A specified parameter was not correct: fileType [ 686.268494] env[60400]: Faults: ['InvalidArgument'] {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 686.268939] env[60400]: DEBUG nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 686.269116] env[60400]: DEBUG nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 686.269375] env[60400]: DEBUG nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 686.269460] env[60400]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 687.181382] env[60400]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 687.196285] env[60400]: INFO nova.compute.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Took 0.93 seconds to deallocate network for instance. 
[ 687.340080] env[60400]: INFO nova.scheduler.client.report [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Deleted allocations for instance 148f525a-f3c0-40f2-8527-9607cd5e581b [ 687.367686] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "148f525a-f3c0-40f2-8527-9607cd5e581b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 61.972s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.503978] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 699.525777] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 699.526138] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 699.541520] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.541520] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.541766] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.541766] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 699.542862] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc1f1c95-2424-4742-bf18-9dc818934e0d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.551451] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a5d0a72-0fa0-4523-843b-b424d16a0247 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.565381] env[60400]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81edc432-ca19-40a7-ba94-eddf6c91a8b8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.571721] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8dfb693-548f-4081-8df7-65b0f8784bc3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.603090] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181789MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 699.603090] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.603090] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.668037] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 30c40353-01fe-407d-8d56-0f6c166d12e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 699.669193] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 130961ce-1e22-4320-abc9-30fc5f652be3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 699.669193] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 4540cd82-440c-41e3-8bfa-b384da6fc964 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 699.669193] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 699.669193] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
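The inventory dicts repeated in these entries are what placement sees for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc; schedulable capacity per resource class is (total - reserved) * allocation_ratio. A quick check against the values above:

    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0 -- so the resource
    # tracker's "Total usable vcpus: 48, total allocated vcpus: 7" raw view
    # leaves ample headroom once the 4.0 VCPU ratio is applied.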
{{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 699.669452] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance cc1d534d-6a43-4575-895d-c3bef84d772e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 699.669452] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance e4f0342a-4169-40aa-b234-a2e2340d5b05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 699.669452] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 699.669452] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 699.781796] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c19209e-9f7b-4f5c-a23d-0e9be420ccca {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.788882] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3217afcb-7001-49ce-b530-5f8cb0966024 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.818083] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dafc5db-a691-4edd-b64d-522c9e96f403 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.825087] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88ab4598-a6ff-4ceb-8481-875d4d9a0d81 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.838087] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 699.853553] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 699.868804] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 699.868977] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.267s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.276137] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 700.277146] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 700.277146] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 700.928900] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 700.933710] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 700.933710] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 700.933710] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 700.957464] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 700.957464] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 700.957464] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Skipping network cache update for instance because it is Building. 
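The "Running periodic task ComputeManager._*" lines come from oslo.service's periodic task machinery: methods decorated with @periodic_task.periodic_task on a PeriodicTasks subclass are collected and invoked on their spacing. A minimal sketch (the task body and spacing are illustrative, not Nova's actual values):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60)
        def _heal_instance_info_cache(self, context):
            # Real Nova rebuilds the list of instances to heal here; instances
            # still Building are skipped, as the entries above record.
            pass

    mgr = Manager(CONF)
    mgr.run_periodic_tasks(context=None)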
{{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 700.957464] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 700.957464] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 700.957777] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 700.961250] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 700.961562] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 700.962202] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 700.962512] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 700.962800] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 709.056332] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "f202a181-b5ea-4b06-91ad-86356b51e088" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.059903] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "f202a181-b5ea-4b06-91ad-86356b51e088" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.069192] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 
tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 709.133182] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.133470] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.135203] env[60400]: INFO nova.compute.claims [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 709.348404] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-505ced0c-137e-45e4-b267-b248f3f04fc2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.357166] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dcd7bb6-8dd1-41c8-a5ce-0930b0b7e3ae {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.388474] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc01657f-94a0-4385-836a-83deabf5a6f5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.395601] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f39d1bf9-9949-48b7-88ff-52cb46c2ce8a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.409945] env[60400]: DEBUG nova.compute.provider_tree [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 709.420229] env[60400]: DEBUG nova.scheduler.client.report [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 709.448083] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.311s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.448551] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 709.500940] env[60400]: DEBUG nova.compute.utils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 709.500940] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 709.500940] env[60400]: DEBUG nova.network.neutron [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 709.520332] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 709.636590] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Start spawning the instance on the hypervisor. 
{{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 709.661760] env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 709.662156] env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 709.665445] env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 709.665445] env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 709.665445] env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 709.665445] env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 709.665445] env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 709.665663] env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 709.665663] 
env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 709.665663] env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 709.665663] env[60400]: DEBUG nova.virt.hardware [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 709.665663] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a88db9e2-c311-470c-8470-ba5747fed365 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.673754] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d73f91c5-8af5-42ae-9c21-aa332acaddc7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.840047] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "63151ec9-f383-46cc-ac57-c3f7f1569410" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.842156] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "63151ec9-f383-46cc-ac57-c3f7f1569410" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.852342] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Starting instance...
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 709.918292] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.920685] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.920685] env[60400]: INFO nova.compute.claims [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 710.056461] env[60400]: DEBUG nova.policy [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53a288e38bec4962997348279606f1a0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'abdf2b93a0a241ae9fa1b395f41da87e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 710.133653] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-262172da-df21-466c-9583-91efb623ef67 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.143378] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa72df28-9866-495d-bf1f-c7c603e29b23 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.175313] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7845586-1b90-4302-964a-64c44b720174 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.183441] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e99f521e-a152-4574-b3df-7883903b26c7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.199716] env[60400]: DEBUG nova.compute.provider_tree [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 710.209020] env[60400]: DEBUG nova.scheduler.client.report [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 
tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 710.227410] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 710.227910] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 710.271786] env[60400]: DEBUG nova.compute.utils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 710.273319] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 710.273532] env[60400]: DEBUG nova.network.neutron [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 710.283733] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 710.383911] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Start spawning the instance on the hypervisor. 
{{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 710.413853] env[60400]: DEBUG nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 710.413853] env[60400]: DEBUG nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 710.413853] env[60400]: DEBUG nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 710.414048] env[60400]: DEBUG nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 710.414207] env[60400]: DEBUG nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 710.414260] env[60400]: DEBUG nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 710.414478] env[60400]: DEBUG nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 710.414640] env[60400]: DEBUG nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 710.416023] env[60400]: DEBUG 
nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 710.416023] env[60400]: DEBUG nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 710.416023] env[60400]: DEBUG nova.virt.hardware [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 710.416023] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61ede534-4589-41eb-a191-231b5d2379ea {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.425925] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd29cb68-d37b-4e10-8feb-2c3c432a675e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.457665] env[60400]: DEBUG nova.policy [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97786b94b75a41b8bbc3db750aa7b8d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fc9e7dc1863d455c98d44991ab5be2bc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 711.640858] env[60400]: DEBUG nova.network.neutron [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Successfully created port: 7f9ba268-7959-4d15-8d3d-f7a6c33a0287 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 711.774882] env[60400]: DEBUG nova.network.neutron [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Successfully created port: fb944cf8-2052-40f8-ac94-fa2beac376d5 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 713.823137] env[60400]: DEBUG nova.network.neutron [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Successfully updated port: 7f9ba268-7959-4d15-8d3d-f7a6c33a0287 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 713.836542] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 
tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "refresh_cache-63151ec9-f383-46cc-ac57-c3f7f1569410" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 713.836542] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquired lock "refresh_cache-63151ec9-f383-46cc-ac57-c3f7f1569410" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 713.836542] env[60400]: DEBUG nova.network.neutron [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 713.942077] env[60400]: DEBUG nova.network.neutron [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 714.302782] env[60400]: DEBUG nova.network.neutron [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Successfully updated port: fb944cf8-2052-40f8-ac94-fa2beac376d5 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 714.318919] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "refresh_cache-f202a181-b5ea-4b06-91ad-86356b51e088" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 714.319075] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquired lock "refresh_cache-f202a181-b5ea-4b06-91ad-86356b51e088" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 714.319974] env[60400]: DEBUG nova.network.neutron [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 714.448322] env[60400]: DEBUG nova.network.neutron [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Instance cache missing network info. 
{{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 714.557527] env[60400]: DEBUG nova.network.neutron [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Updating instance_info_cache with network_info: [{"id": "7f9ba268-7959-4d15-8d3d-f7a6c33a0287", "address": "fa:16:3e:e1:35:94", "network": {"id": "8c7c4ae7-341a-4975-87f3-c5580b80ce1b", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-21103876-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fc9e7dc1863d455c98d44991ab5be2bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7f9ba268-79", "ovs_interfaceid": "7f9ba268-7959-4d15-8d3d-f7a6c33a0287", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 714.576017] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Releasing lock "refresh_cache-63151ec9-f383-46cc-ac57-c3f7f1569410" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 714.576017] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Instance network_info: |[{"id": "7f9ba268-7959-4d15-8d3d-f7a6c33a0287", "address": "fa:16:3e:e1:35:94", "network": {"id": "8c7c4ae7-341a-4975-87f3-c5580b80ce1b", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-21103876-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fc9e7dc1863d455c98d44991ab5be2bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7f9ba268-79", "ovs_interfaceid": "7f9ba268-7959-4d15-8d3d-f7a6c33a0287", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 714.576377] 
env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e1:35:94', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2d88bb07-f93c-45ca-bce7-230cb1f33833', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7f9ba268-7959-4d15-8d3d-f7a6c33a0287', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 714.586871] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Creating folder: Project (fc9e7dc1863d455c98d44991ab5be2bc). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 714.588675] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6e478afa-546e-4c45-bfd9-b77f7e78689e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.600417] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Created folder: Project (fc9e7dc1863d455c98d44991ab5be2bc) in parent group-v119075. [ 714.600417] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Creating folder: Instances. Parent ref: group-v119104. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 714.600417] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-527d06e4-ebef-4525-8598-266b15c42168 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.610028] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Created folder: Instances in parent group-v119104. [ 714.610266] env[60400]: DEBUG oslo.service.loopingcall [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 714.610457] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 714.610660] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0275218a-d374-4b68-891a-bc9e60a36ba3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.634549] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 714.634549] env[60400]: value = "task-449791" [ 714.634549] env[60400]: _type = "Task" [ 714.634549] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 714.643747] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449791, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 714.997505] env[60400]: DEBUG nova.network.neutron [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Updating instance_info_cache with network_info: [{"id": "fb944cf8-2052-40f8-ac94-fa2beac376d5", "address": "fa:16:3e:9b:bf:74", "network": {"id": "3a999734-2ed8-4112-9542-21b93f70061b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1764458105-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "abdf2b93a0a241ae9fa1b395f41da87e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b356db78-99c7-4464-822c-fc7e193f7878", "external-id": "nsx-vlan-transportzone-231", "segmentation_id": 231, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfb944cf8-20", "ovs_interfaceid": "fb944cf8-2052-40f8-ac94-fa2beac376d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.005840] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "c5b391a9-7969-4119-9bc6-b0e1fe7a9713" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.006076] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "c5b391a9-7969-4119-9bc6-b0e1fe7a9713" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.013279] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Releasing lock "refresh_cache-f202a181-b5ea-4b06-91ad-86356b51e088" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 715.013587] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Instance network_info: |[{"id": "fb944cf8-2052-40f8-ac94-fa2beac376d5", "address": "fa:16:3e:9b:bf:74", "network": {"id": "3a999734-2ed8-4112-9542-21b93f70061b", "bridge": 
"br-int", "label": "tempest-AttachInterfacesTestJSON-1764458105-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "abdf2b93a0a241ae9fa1b395f41da87e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b356db78-99c7-4464-822c-fc7e193f7878", "external-id": "nsx-vlan-transportzone-231", "segmentation_id": 231, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfb944cf8-20", "ovs_interfaceid": "fb944cf8-2052-40f8-ac94-fa2beac376d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 715.014291] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:bf:74', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b356db78-99c7-4464-822c-fc7e193f7878', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fb944cf8-2052-40f8-ac94-fa2beac376d5', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 715.023800] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Creating folder: Project (abdf2b93a0a241ae9fa1b395f41da87e). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 715.026345] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ba2ec4d8-7f95-4566-a532-716b51f5c3db {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.029061] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 715.040340] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Created folder: Project (abdf2b93a0a241ae9fa1b395f41da87e) in parent group-v119075. [ 715.040537] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Creating folder: Instances. Parent ref: group-v119107. 
{{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 715.040967] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-81da1e3b-4948-4a36-9764-ff7e5a2100a5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.052889] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Created folder: Instances in parent group-v119107. [ 715.052889] env[60400]: DEBUG oslo.service.loopingcall [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 715.052985] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 715.053107] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-145f5fe7-e235-4dc3-9a98-6b56590c8b97 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.083976] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 715.083976] env[60400]: value = "task-449794" [ 715.083976] env[60400]: _type = "Task" [ 715.083976] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 715.095312] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449794, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 715.118038] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.118309] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.120326] env[60400]: INFO nova.compute.claims [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 715.139098] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "95f71b47-73c8-4a82-b806-f6f2ed9cdbb3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.142425] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "95f71b47-73c8-4a82-b806-f6f2ed9cdbb3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.151293] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449791, 'name': CreateVM_Task, 'duration_secs': 0.316864} completed successfully.
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 715.151510] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 715.152300] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 715.152511] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 715.152901] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 715.153196] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1a7b5fd1-9075-4156-bd5f-530219c9911a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.158020] env[60400]: DEBUG oslo_vmware.api [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Waiting for the task: (returnval){ [ 715.158020] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52052888-80a2-3538-8604-26fb8fce007e" [ 715.158020] env[60400]: _type = "Task" [ 715.158020] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 715.167221] env[60400]: DEBUG oslo_vmware.api [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52052888-80a2-3538-8604-26fb8fce007e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 715.367942] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c9514e3-61b5-4d0a-8a04-dfb622611db2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.375989] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c158c43a-e23a-4b7d-a24e-ba501c1c23fa {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.406323] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30fb5729-667a-4c38-840b-c8d39a40082e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.414562] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61822ff2-f238-4fea-9c20-327fd85466e8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.427947] env[60400]: DEBUG nova.compute.provider_tree [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 715.437028] env[60400]: DEBUG nova.scheduler.client.report [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 715.452077] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.452753] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Start building networks asynchronously for instance. 
{{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 715.490089] env[60400]: DEBUG nova.compute.utils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 715.491324] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 715.491518] env[60400]: DEBUG nova.network.neutron [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 715.501815] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 715.573584] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 715.596192] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449794, 'name': CreateVM_Task, 'duration_secs': 0.283936} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 715.598011] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 715.598230] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 715.598379] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 715.598549] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 715.598687] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 715.598837] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 715.599043] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 715.599194] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 715.599528] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 
tempest-ServersTestJSON-895806724-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 715.599528] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 715.599796] env[60400]: DEBUG nova.virt.hardware [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 715.600373] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 715.600815] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8afd95ad-8a2c-4928-9d82-9dfac3da38f0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.604964] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 715.612948] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c97ffce-2199-4ae4-8f07-700a135467d6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.648780] env[60400]: DEBUG nova.policy [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9579a37d71414dae93da5b1490e44c86', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5981d0de9a5545a4b2db5ab222672012', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 715.668369] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 715.668641] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 715.668873] env[60400]: DEBUG 
oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 715.669122] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 715.669422] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 715.669684] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c45f4a08-6ec7-4f55-8038-13b944817238 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.674452] env[60400]: DEBUG oslo_vmware.api [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Waiting for the task: (returnval){ [ 715.674452] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5288e7c0-b04c-b2af-82bd-45c84873e43b" [ 715.674452] env[60400]: _type = "Task" [ 715.674452] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 715.682703] env[60400]: DEBUG oslo_vmware.api [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5288e7c0-b04c-b2af-82bd-45c84873e43b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 716.192618] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 716.192873] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 716.193110] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 716.455448] env[60400]: DEBUG nova.compute.manager [req-31ce7b74-f199-439a-a4eb-e068db1c023f req-f7ffa3f3-cd68-4aef-be5b-c02b4216c1e7 service nova] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Received event network-vif-plugged-fb944cf8-2052-40f8-ac94-fa2beac376d5 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 716.455665] env[60400]: DEBUG oslo_concurrency.lockutils [req-31ce7b74-f199-439a-a4eb-e068db1c023f req-f7ffa3f3-cd68-4aef-be5b-c02b4216c1e7 service nova] Acquiring lock "f202a181-b5ea-4b06-91ad-86356b51e088-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.455870] env[60400]: DEBUG oslo_concurrency.lockutils [req-31ce7b74-f199-439a-a4eb-e068db1c023f req-f7ffa3f3-cd68-4aef-be5b-c02b4216c1e7 service nova] Lock "f202a181-b5ea-4b06-91ad-86356b51e088-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.456477] env[60400]: DEBUG oslo_concurrency.lockutils [req-31ce7b74-f199-439a-a4eb-e068db1c023f req-f7ffa3f3-cd68-4aef-be5b-c02b4216c1e7 service nova] Lock "f202a181-b5ea-4b06-91ad-86356b51e088-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.456718] env[60400]: DEBUG nova.compute.manager [req-31ce7b74-f199-439a-a4eb-e068db1c023f req-f7ffa3f3-cd68-4aef-be5b-c02b4216c1e7 service nova] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] No waiting events found dispatching network-vif-plugged-fb944cf8-2052-40f8-ac94-fa2beac376d5 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 716.456893] env[60400]: WARNING nova.compute.manager [req-31ce7b74-f199-439a-a4eb-e068db1c023f req-f7ffa3f3-cd68-4aef-be5b-c02b4216c1e7 service nova] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Received unexpected event network-vif-plugged-fb944cf8-2052-40f8-ac94-fa2beac376d5 for instance with vm_state
building and task_state spawning. [ 716.503059] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "7476fb96-5247-472c-ab92-ef7e5916cb00" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.503282] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "7476fb96-5247-472c-ab92-ef7e5916cb00" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.541575] env[60400]: DEBUG nova.compute.manager [req-b7d67ecc-f5ca-406a-a9be-2538ca06e4a2 req-04302a04-afea-41ab-b637-76e86060ee44 service nova] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Received event network-vif-plugged-7f9ba268-7959-4d15-8d3d-f7a6c33a0287 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 716.541793] env[60400]: DEBUG oslo_concurrency.lockutils [req-b7d67ecc-f5ca-406a-a9be-2538ca06e4a2 req-04302a04-afea-41ab-b637-76e86060ee44 service nova] Acquiring lock "63151ec9-f383-46cc-ac57-c3f7f1569410-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.541987] env[60400]: DEBUG oslo_concurrency.lockutils [req-b7d67ecc-f5ca-406a-a9be-2538ca06e4a2 req-04302a04-afea-41ab-b637-76e86060ee44 service nova] Lock "63151ec9-f383-46cc-ac57-c3f7f1569410-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.542157] env[60400]: DEBUG oslo_concurrency.lockutils [req-b7d67ecc-f5ca-406a-a9be-2538ca06e4a2 req-04302a04-afea-41ab-b637-76e86060ee44 service nova] Lock "63151ec9-f383-46cc-ac57-c3f7f1569410-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.542309] env[60400]: DEBUG nova.compute.manager [req-b7d67ecc-f5ca-406a-a9be-2538ca06e4a2 req-04302a04-afea-41ab-b637-76e86060ee44 service nova] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] No waiting events found dispatching network-vif-plugged-7f9ba268-7959-4d15-8d3d-f7a6c33a0287 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 716.542462] env[60400]: WARNING nova.compute.manager [req-b7d67ecc-f5ca-406a-a9be-2538ca06e4a2 req-04302a04-afea-41ab-b637-76e86060ee44 service nova] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Received unexpected event network-vif-plugged-7f9ba268-7959-4d15-8d3d-f7a6c33a0287 for instance with vm_state building and task_state spawning.
[ 716.988416] env[60400]: DEBUG nova.network.neutron [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Successfully created port: 6cf68b2a-cbef-4cdf-9893-d28ee3add61e {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 718.712354] env[60400]: DEBUG nova.network.neutron [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Successfully updated port: 6cf68b2a-cbef-4cdf-9893-d28ee3add61e {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 718.729727] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "refresh_cache-c5b391a9-7969-4119-9bc6-b0e1fe7a9713" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 718.730577] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquired lock "refresh_cache-c5b391a9-7969-4119-9bc6-b0e1fe7a9713" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 718.732200] env[60400]: DEBUG nova.network.neutron [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 718.822304] env[60400]: DEBUG nova.network.neutron [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Instance cache missing network info. 
{{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 718.985974] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "35630c7b-fdf4-4d6d-8e5a-0045f1387f93" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.985974] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "35630c7b-fdf4-4d6d-8e5a-0045f1387f93" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 719.001185] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "837197c0-9ff8-45a2-8bf0-730158a43a17" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 719.001412] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "837197c0-9ff8-45a2-8bf0-730158a43a17" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 719.106050] env[60400]: DEBUG nova.network.neutron [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Updating instance_info_cache with network_info: [{"id": "6cf68b2a-cbef-4cdf-9893-d28ee3add61e", "address": "fa:16:3e:58:06:b6", "network": {"id": "7e85f6fb-fd54-4bab-bb8b-03128ff2fa2d", "bridge": "br-int", "label": "tempest-ServersTestJSON-248675440-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5981d0de9a5545a4b2db5ab222672012", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f493cd8-1cb4-42a1-8d56-bfa6ac7cf563", "external-id": "nsx-vlan-transportzone-931", "segmentation_id": 931, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6cf68b2a-cb", "ovs_interfaceid": "6cf68b2a-cbef-4cdf-9893-d28ee3add61e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.118134] env[60400]: DEBUG oslo_concurrency.lockutils [None
req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Releasing lock "refresh_cache-c5b391a9-7969-4119-9bc6-b0e1fe7a9713" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 719.118134] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Instance network_info: |[{"id": "6cf68b2a-cbef-4cdf-9893-d28ee3add61e", "address": "fa:16:3e:58:06:b6", "network": {"id": "7e85f6fb-fd54-4bab-bb8b-03128ff2fa2d", "bridge": "br-int", "label": "tempest-ServersTestJSON-248675440-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5981d0de9a5545a4b2db5ab222672012", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f493cd8-1cb4-42a1-8d56-bfa6ac7cf563", "external-id": "nsx-vlan-transportzone-931", "segmentation_id": 931, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6cf68b2a-cb", "ovs_interfaceid": "6cf68b2a-cbef-4cdf-9893-d28ee3add61e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 719.118635] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:58:06:b6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6f493cd8-1cb4-42a1-8d56-bfa6ac7cf563', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6cf68b2a-cbef-4cdf-9893-d28ee3add61e', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 719.128403] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Creating folder: Project (5981d0de9a5545a4b2db5ab222672012). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 719.132054] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-02937d30-26c6-41f1-ac41-8155726a76e0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.142774] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Created folder: Project (5981d0de9a5545a4b2db5ab222672012) in parent group-v119075. [ 719.143034] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Creating folder: Instances. Parent ref: group-v119110. 
{{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 719.143272] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6381dcdb-d77a-423d-951d-662d2329f33c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.152105] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Created folder: Instances in parent group-v119110. [ 719.152383] env[60400]: DEBUG oslo.service.loopingcall [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 719.152564] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 719.152802] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0557192b-d39e-4ef8-9ca7-d9e2bc19a739 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.178882] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 719.178882] env[60400]: value = "task-449797" [ 719.178882] env[60400]: _type = "Task" [ 719.178882] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 719.191865] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449797, 'name': CreateVM_Task} progress is 5%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 719.640185] env[60400]: DEBUG nova.compute.manager [req-bc4ed3cb-a064-44a1-ae51-431caa887c12 req-3b313539-dd2e-4650-98f8-880c68b0d63d service nova] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Received event network-changed-fb944cf8-2052-40f8-ac94-fa2beac376d5 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 719.640556] env[60400]: DEBUG nova.compute.manager [req-bc4ed3cb-a064-44a1-ae51-431caa887c12 req-3b313539-dd2e-4650-98f8-880c68b0d63d service nova] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Refreshing instance network info cache due to event network-changed-fb944cf8-2052-40f8-ac94-fa2beac376d5. 
{{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 719.640918] env[60400]: DEBUG oslo_concurrency.lockutils [req-bc4ed3cb-a064-44a1-ae51-431caa887c12 req-3b313539-dd2e-4650-98f8-880c68b0d63d service nova] Acquiring lock "refresh_cache-f202a181-b5ea-4b06-91ad-86356b51e088" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 719.642492] env[60400]: DEBUG oslo_concurrency.lockutils [req-bc4ed3cb-a064-44a1-ae51-431caa887c12 req-3b313539-dd2e-4650-98f8-880c68b0d63d service nova] Acquired lock "refresh_cache-f202a181-b5ea-4b06-91ad-86356b51e088" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 719.642492] env[60400]: DEBUG nova.network.neutron [req-bc4ed3cb-a064-44a1-ae51-431caa887c12 req-3b313539-dd2e-4650-98f8-880c68b0d63d service nova] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Refreshing network info cache for port fb944cf8-2052-40f8-ac94-fa2beac376d5 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 719.663463] env[60400]: DEBUG nova.compute.manager [req-857bc6ce-b0c7-4ff5-975e-c387e10b578d req-91133dc7-da1f-47ae-8ed7-8aa0439a6a9f service nova] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Received event network-changed-7f9ba268-7959-4d15-8d3d-f7a6c33a0287 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 719.663712] env[60400]: DEBUG nova.compute.manager [req-857bc6ce-b0c7-4ff5-975e-c387e10b578d req-91133dc7-da1f-47ae-8ed7-8aa0439a6a9f service nova] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Refreshing instance network info cache due to event network-changed-7f9ba268-7959-4d15-8d3d-f7a6c33a0287. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 719.663982] env[60400]: DEBUG oslo_concurrency.lockutils [req-857bc6ce-b0c7-4ff5-975e-c387e10b578d req-91133dc7-da1f-47ae-8ed7-8aa0439a6a9f service nova] Acquiring lock "refresh_cache-63151ec9-f383-46cc-ac57-c3f7f1569410" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 719.664193] env[60400]: DEBUG oslo_concurrency.lockutils [req-857bc6ce-b0c7-4ff5-975e-c387e10b578d req-91133dc7-da1f-47ae-8ed7-8aa0439a6a9f service nova] Acquired lock "refresh_cache-63151ec9-f383-46cc-ac57-c3f7f1569410" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 719.664404] env[60400]: DEBUG nova.network.neutron [req-857bc6ce-b0c7-4ff5-975e-c387e10b578d req-91133dc7-da1f-47ae-8ed7-8aa0439a6a9f service nova] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Refreshing network info cache for port 7f9ba268-7959-4d15-8d3d-f7a6c33a0287 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 719.690758] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449797, 'name': CreateVM_Task, 'duration_secs': 0.290772} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 719.691239] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 719.691968] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 719.692274] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 719.692677] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 719.693058] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-087550c3-9b85-455d-8fb7-6039cc67381a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.698677] env[60400]: DEBUG oslo_vmware.api [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Waiting for the task: (returnval){ [ 719.698677] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]529847f3-ee14-f681-7824-cdc456004a41" [ 719.698677] env[60400]: _type = "Task" [ 719.698677] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 719.712738] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 719.712738] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 719.712738] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 720.079504] env[60400]: DEBUG nova.network.neutron [req-857bc6ce-b0c7-4ff5-975e-c387e10b578d req-91133dc7-da1f-47ae-8ed7-8aa0439a6a9f service nova] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Updated VIF entry in instance network info cache for port 7f9ba268-7959-4d15-8d3d-f7a6c33a0287. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 720.079999] env[60400]: DEBUG nova.network.neutron [req-857bc6ce-b0c7-4ff5-975e-c387e10b578d req-91133dc7-da1f-47ae-8ed7-8aa0439a6a9f service nova] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Updating instance_info_cache with network_info: [{"id": "7f9ba268-7959-4d15-8d3d-f7a6c33a0287", "address": "fa:16:3e:e1:35:94", "network": {"id": "8c7c4ae7-341a-4975-87f3-c5580b80ce1b", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-21103876-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fc9e7dc1863d455c98d44991ab5be2bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7f9ba268-79", "ovs_interfaceid": "7f9ba268-7959-4d15-8d3d-f7a6c33a0287", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.090980] env[60400]: DEBUG oslo_concurrency.lockutils [req-857bc6ce-b0c7-4ff5-975e-c387e10b578d req-91133dc7-da1f-47ae-8ed7-8aa0439a6a9f service nova] Releasing lock "refresh_cache-63151ec9-f383-46cc-ac57-c3f7f1569410" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 720.190504] env[60400]: DEBUG nova.network.neutron 
[req-bc4ed3cb-a064-44a1-ae51-431caa887c12 req-3b313539-dd2e-4650-98f8-880c68b0d63d service nova] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Updated VIF entry in instance network info cache for port fb944cf8-2052-40f8-ac94-fa2beac376d5. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 720.191189] env[60400]: DEBUG nova.network.neutron [req-bc4ed3cb-a064-44a1-ae51-431caa887c12 req-3b313539-dd2e-4650-98f8-880c68b0d63d service nova] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Updating instance_info_cache with network_info: [{"id": "fb944cf8-2052-40f8-ac94-fa2beac376d5", "address": "fa:16:3e:9b:bf:74", "network": {"id": "3a999734-2ed8-4112-9542-21b93f70061b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1764458105-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "abdf2b93a0a241ae9fa1b395f41da87e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b356db78-99c7-4464-822c-fc7e193f7878", "external-id": "nsx-vlan-transportzone-231", "segmentation_id": 231, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfb944cf8-20", "ovs_interfaceid": "fb944cf8-2052-40f8-ac94-fa2beac376d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.203202] env[60400]: DEBUG oslo_concurrency.lockutils [req-bc4ed3cb-a064-44a1-ae51-431caa887c12 req-3b313539-dd2e-4650-98f8-880c68b0d63d service nova] Releasing lock "refresh_cache-f202a181-b5ea-4b06-91ad-86356b51e088" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 721.210345] env[60400]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "56471a78-08cd-4d1a-b3f5-d1eac277183e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.210627] env[60400]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "56471a78-08cd-4d1a-b3f5-d1eac277183e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.916540] env[60400]: DEBUG nova.compute.manager [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Received event network-vif-plugged-6cf68b2a-cbef-4cdf-9893-d28ee3add61e {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 721.916751] env[60400]: DEBUG oslo_concurrency.lockutils [req-796243a3-d140-4468-9325-ce10c5ef91ce
req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] Acquiring lock "c5b391a9-7969-4119-9bc6-b0e1fe7a9713-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.916944] env[60400]: DEBUG oslo_concurrency.lockutils [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] Lock "c5b391a9-7969-4119-9bc6-b0e1fe7a9713-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.917119] env[60400]: DEBUG oslo_concurrency.lockutils [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] Lock "c5b391a9-7969-4119-9bc6-b0e1fe7a9713-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.917276] env[60400]: DEBUG nova.compute.manager [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] No waiting events found dispatching network-vif-plugged-6cf68b2a-cbef-4cdf-9893-d28ee3add61e {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 721.917520] env[60400]: WARNING nova.compute.manager [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Received unexpected event network-vif-plugged-6cf68b2a-cbef-4cdf-9893-d28ee3add61e for instance with vm_state building and task_state spawning. [ 721.917698] env[60400]: DEBUG nova.compute.manager [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Received event network-changed-6cf68b2a-cbef-4cdf-9893-d28ee3add61e {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 721.917850] env[60400]: DEBUG nova.compute.manager [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Refreshing instance network info cache due to event network-changed-6cf68b2a-cbef-4cdf-9893-d28ee3add61e.
{{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 721.918038] env[60400]: DEBUG oslo_concurrency.lockutils [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] Acquiring lock "refresh_cache-c5b391a9-7969-4119-9bc6-b0e1fe7a9713" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 721.918168] env[60400]: DEBUG oslo_concurrency.lockutils [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] Acquired lock "refresh_cache-c5b391a9-7969-4119-9bc6-b0e1fe7a9713" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 721.918318] env[60400]: DEBUG nova.network.neutron [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Refreshing network info cache for port 6cf68b2a-cbef-4cdf-9893-d28ee3add61e {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 722.580310] env[60400]: DEBUG nova.network.neutron [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Updated VIF entry in instance network info cache for port 6cf68b2a-cbef-4cdf-9893-d28ee3add61e. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 722.580574] env[60400]: DEBUG nova.network.neutron [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Updating instance_info_cache with network_info: [{"id": "6cf68b2a-cbef-4cdf-9893-d28ee3add61e", "address": "fa:16:3e:58:06:b6", "network": {"id": "7e85f6fb-fd54-4bab-bb8b-03128ff2fa2d", "bridge": "br-int", "label": "tempest-ServersTestJSON-248675440-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5981d0de9a5545a4b2db5ab222672012", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f493cd8-1cb4-42a1-8d56-bfa6ac7cf563", "external-id": "nsx-vlan-transportzone-931", "segmentation_id": 931, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6cf68b2a-cb", "ovs_interfaceid": "6cf68b2a-cbef-4cdf-9893-d28ee3add61e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.590664] env[60400]: DEBUG oslo_concurrency.lockutils [req-796243a3-d140-4468-9325-ce10c5ef91ce req-a2596d50-5dce-4828-a0a1-983a83a9c208 service nova] Releasing lock "refresh_cache-c5b391a9-7969-4119-9bc6-b0e1fe7a9713" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 725.674076] env[60400]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Acquiring lock "1240824e-c5f1-4517-b182-20245311c687" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 725.674423] env[60400]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "1240824e-c5f1-4517-b182-20245311c687" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.526440] env[60400]: WARNING oslo_vmware.rw_handles [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 728.526440] env[60400]: ERROR oslo_vmware.rw_handles [ 728.526997] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/18aa9928-b05c-45e1-880a-757e1d0859cf/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 728.528487] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 728.528718] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Copying Virtual Disk [datastore1] vmware_temp/18aa9928-b05c-45e1-880a-757e1d0859cf/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/18aa9928-b05c-45e1-880a-757e1d0859cf/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 728.528989] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9f241999-ac53-4faa-91e4-d345b631d138 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.537365] env[60400]: DEBUG oslo_vmware.api [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Waiting for the task: (returnval){ [ 728.537365] env[60400]: value = "task-449798" [ 728.537365] env[60400]: _type = "Task" [ 728.537365] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 728.546151] env[60400]: DEBUG oslo_vmware.api [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Task: {'id': task-449798, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 729.054701] env[60400]: DEBUG oslo_vmware.exceptions [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 729.056875] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 729.056875] env[60400]: ERROR nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 729.056875] env[60400]: Faults: ['InvalidArgument'] [ 729.056875] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Traceback (most recent call last): [ 729.056875] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 729.056875] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] yield resources [ 729.056875] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 729.056875] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] self.driver.spawn(context, instance, image_meta, [ 729.056875] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 729.056875] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] self._fetch_image_if_missing(context, vi) [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] image_cache(vi, tmp_image_ds_loc) [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] vm_util.copy_virtual_disk( [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] session._wait_for_task(vmdk_copy_task) [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] return self.wait_for_task(task_ref) [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] return evt.wait() [ 729.057767] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 729.058287] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] result = hub.switch() [ 729.058287] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 729.058287] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] return self.greenlet.switch() [ 729.058287] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 729.058287] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] self.f(*self.args, **self.kw) [ 729.058287] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 729.058287] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] raise exceptions.translate_fault(task_info.error) [ 729.058287] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 729.058287] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Faults: ['InvalidArgument'] [ 729.058287] env[60400]: ERROR nova.compute.manager [instance: 
30c40353-01fe-407d-8d56-0f6c166d12e3] [ 729.058287] env[60400]: INFO nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Terminating instance [ 729.059508] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 729.059758] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 729.060051] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1f6db022-a440-47ec-9239-d0719dab6dd1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.066490] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 729.066490] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 729.066490] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d577273d-2832-4262-9cce-c8d1f0c47fee {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.075311] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 729.077882] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d13ae3e2-cd4c-4873-b602-8fe394bc3bd1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.079580] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 729.079749] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Folder [datastore1] 
devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 729.082147] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ad53526b-c50a-4c2e-90ce-aafa89f1076d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.088119] env[60400]: DEBUG oslo_vmware.api [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Waiting for the task: (returnval){ [ 729.088119] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]525e469d-071c-f81e-a41b-cd6c54c3d4c8" [ 729.088119] env[60400]: _type = "Task" [ 729.088119] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 729.100380] env[60400]: DEBUG oslo_vmware.api [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]525e469d-071c-f81e-a41b-cd6c54c3d4c8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 729.151319] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 729.151434] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 729.151605] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Deleting the datastore file [datastore1] 30c40353-01fe-407d-8d56-0f6c166d12e3 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 729.151870] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5c1967e7-af34-479c-a2af-53198fa14029 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.158861] env[60400]: DEBUG oslo_vmware.api [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Waiting for the task: (returnval){ [ 729.158861] env[60400]: value = "task-449800" [ 729.158861] env[60400]: _type = "Task" [ 729.158861] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 729.168580] env[60400]: DEBUG oslo_vmware.api [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Task: {'id': task-449800, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 729.599482] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 729.599820] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Creating directory with path [datastore1] vmware_temp/ae2f6962-75e2-4820-b721-c009efd05ad6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 729.600073] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3eb3dea1-3fed-4c4f-9836-257793aa698b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.611965] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Created directory with path [datastore1] vmware_temp/ae2f6962-75e2-4820-b721-c009efd05ad6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 729.612249] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Fetch image to [datastore1] vmware_temp/ae2f6962-75e2-4820-b721-c009efd05ad6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 729.612728] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/ae2f6962-75e2-4820-b721-c009efd05ad6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 729.613548] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfcb8e04-cc14-4c6e-b500-200ec386607a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.621126] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feef08d7-1f33-48b5-9126-aedcfaeeeaa8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.632356] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb06a29d-241e-4f5c-93b4-36428f9b411c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.674465] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5f5f854d-d191-4567-acf8-77bb3f190e82 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.686311] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8ac71334-65b6-4ad4-b6c7-fb0fbd917c88 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.688310] env[60400]: DEBUG oslo_vmware.api [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Task: {'id': task-449800, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065619} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 729.688605] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 729.688825] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 729.689043] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 729.689254] env[60400]: INFO nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 729.691812] env[60400]: DEBUG nova.compute.claims [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 729.692035] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.692283] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.782301] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 729.853088] env[60400]: DEBUG oslo_vmware.rw_handles [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ae2f6962-75e2-4820-b721-c009efd05ad6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 729.912807] env[60400]: DEBUG oslo_vmware.rw_handles [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 729.913154] env[60400]: DEBUG oslo_vmware.rw_handles [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ae2f6962-75e2-4820-b721-c009efd05ad6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 730.059736] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3bc9120-811a-451d-9ffe-c31fe7bb632f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.070135] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-728f1476-cdd9-40c7-94ac-60d94f6f7429 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.111086] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ca91b69-8f0f-41be-9aef-aa300370b504 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.119183] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e35c9559-713c-4115-9330-cf21c1d9ba5b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.132239] env[60400]: DEBUG nova.compute.provider_tree [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 730.141280] env[60400]: DEBUG nova.scheduler.client.report [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 730.157234] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.465s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.158075] env[60400]: ERROR nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 730.158075] env[60400]: Faults: ['InvalidArgument'] [ 730.158075] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Traceback (most recent call last): [ 730.158075] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 730.158075] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] 
self.driver.spawn(context, instance, image_meta, [ 730.158075] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 730.158075] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 730.158075] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 730.158075] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] self._fetch_image_if_missing(context, vi) [ 730.158075] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 730.158075] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] image_cache(vi, tmp_image_ds_loc) [ 730.158075] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] vm_util.copy_virtual_disk( [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] session._wait_for_task(vmdk_copy_task) [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] return self.wait_for_task(task_ref) [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] return evt.wait() [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] result = hub.switch() [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] return self.greenlet.switch() [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 730.158445] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] self.f(*self.args, **self.kw) [ 730.158784] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 730.158784] env[60400]: ERROR nova.compute.manager [instance: 
30c40353-01fe-407d-8d56-0f6c166d12e3] raise exceptions.translate_fault(task_info.error) [ 730.158784] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 730.158784] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Faults: ['InvalidArgument'] [ 730.158784] env[60400]: ERROR nova.compute.manager [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] [ 730.158784] env[60400]: DEBUG nova.compute.utils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] VimFaultException {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 730.160302] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Build of instance 30c40353-01fe-407d-8d56-0f6c166d12e3 was re-scheduled: A specified parameter was not correct: fileType [ 730.160302] env[60400]: Faults: ['InvalidArgument'] {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 730.160727] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 730.160901] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 730.161056] env[60400]: DEBUG nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 730.161215] env[60400]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 730.790288] env[60400]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.805187] env[60400]: INFO nova.compute.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Took 0.64 seconds to deallocate network for instance.
[ 730.910509] env[60400]: INFO nova.scheduler.client.report [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Deleted allocations for instance 30c40353-01fe-407d-8d56-0f6c166d12e3 [ 730.935883] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "30c40353-01fe-407d-8d56-0f6c166d12e3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 103.330s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.977402] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 731.052889] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.052889] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.054333] env[60400]: INFO nova.compute.claims [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 731.366820] env[60400]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Acquiring lock "daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.366820] env[60400]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.406185] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4164eb17-f0fb-4b96-a02d-16369f7b08be {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.414515] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-aef4fff6-0aec-4c87-8cab-b8c600527bfc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.448020] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df307e41-7186-4af1-8c5f-64c797b28321 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.456398] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb136a70-6269-40d3-bc7a-572e63cb5c26 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.473509] env[60400]: DEBUG nova.compute.provider_tree [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.482678] env[60400]: DEBUG nova.scheduler.client.report [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 731.500777] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.448s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.501723] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 731.541777] env[60400]: DEBUG nova.compute.utils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 731.544379] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Allocating IP information in the background. 
{{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 731.544379] env[60400]: DEBUG nova.network.neutron [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 731.555722] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 731.634397] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 731.659113] env[60400]: DEBUG nova.virt.hardware [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 731.659434] env[60400]: DEBUG nova.virt.hardware [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 731.659654] env[60400]: DEBUG nova.virt.hardware [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 731.659847] env[60400]: DEBUG nova.virt.hardware [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 731.659989] env[60400]: DEBUG nova.virt.hardware [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 731.660147] env[60400]: DEBUG nova.virt.hardware 
[None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 731.660542] env[60400]: DEBUG nova.virt.hardware [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 731.660788] env[60400]: DEBUG nova.virt.hardware [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 731.660963] env[60400]: DEBUG nova.virt.hardware [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 731.661137] env[60400]: DEBUG nova.virt.hardware [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 731.661303] env[60400]: DEBUG nova.virt.hardware [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 731.662318] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-113fd180-1711-4eb1-ace2-b41cf1299fbb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.672376] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67285416-b803-4ff8-ba52-e6ce22eb158a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.888757] env[60400]: DEBUG nova.policy [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0eaa3d12fe1b4a33b50f985d0fe081fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2140598201444851ab98084d07307c86', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 732.823178] env[60400]: DEBUG nova.network.neutron [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] 
[instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Successfully created port: 777d624e-2007-42dc-b553-d6efc26d590f {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 734.586811] env[60400]: DEBUG nova.network.neutron [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Successfully updated port: 777d624e-2007-42dc-b553-d6efc26d590f {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 734.597141] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "refresh_cache-95f71b47-73c8-4a82-b806-f6f2ed9cdbb3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 734.597289] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquired lock "refresh_cache-95f71b47-73c8-4a82-b806-f6f2ed9cdbb3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 734.597435] env[60400]: DEBUG nova.network.neutron [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 734.683834] env[60400]: DEBUG nova.network.neutron [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Instance cache missing network info. 
{{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 735.023380] env[60400]: DEBUG nova.network.neutron [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Updating instance_info_cache with network_info: [{"id": "777d624e-2007-42dc-b553-d6efc26d590f", "address": "fa:16:3e:6e:f8:da", "network": {"id": "8e664329-6cf8-471d-9917-498cc1bdf003", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1102702079-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2140598201444851ab98084d07307c86", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9722ea4d-e4a5-48fc-b759-5c4c4796b1ef", "external-id": "nsx-vlan-transportzone-924", "segmentation_id": 924, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap777d624e-20", "ovs_interfaceid": "777d624e-2007-42dc-b553-d6efc26d590f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.039017] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Releasing lock "refresh_cache-95f71b47-73c8-4a82-b806-f6f2ed9cdbb3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 735.039017] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Instance network_info: |[{"id": "777d624e-2007-42dc-b553-d6efc26d590f", "address": "fa:16:3e:6e:f8:da", "network": {"id": "8e664329-6cf8-471d-9917-498cc1bdf003", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1102702079-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2140598201444851ab98084d07307c86", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9722ea4d-e4a5-48fc-b759-5c4c4796b1ef", "external-id": "nsx-vlan-transportzone-924", "segmentation_id": 924, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap777d624e-20", "ovs_interfaceid": "777d624e-2007-42dc-b553-d6efc26d590f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 735.039251] 
env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6e:f8:da', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9722ea4d-e4a5-48fc-b759-5c4c4796b1ef', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '777d624e-2007-42dc-b553-d6efc26d590f', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 735.046466] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Creating folder: Project (2140598201444851ab98084d07307c86). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 735.047188] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-37e9a259-6873-4d65-98d2-2704770734f4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.065939] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Created folder: Project (2140598201444851ab98084d07307c86) in parent group-v119075. [ 735.065939] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Creating folder: Instances. Parent ref: group-v119113. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 735.065939] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-136d2d2c-868c-4882-a588-3d78beafedf2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.073934] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Created folder: Instances in parent group-v119113. [ 735.074222] env[60400]: DEBUG oslo.service.loopingcall [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 735.074562] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 735.074639] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5c5eb10d-46af-4a35-ad6f-9ab46ec676c8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.095819] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 735.095819] env[60400]: value = "task-449803" [ 735.095819] env[60400]: _type = "Task" [ 735.095819] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 735.104205] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449803, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 735.584463] env[60400]: DEBUG nova.compute.manager [req-65774850-ccc4-4924-88c2-7dc5af4a461b req-77d2a8ec-a86c-4086-a94e-6df212522599 service nova] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Received event network-vif-plugged-777d624e-2007-42dc-b553-d6efc26d590f {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 735.584763] env[60400]: DEBUG oslo_concurrency.lockutils [req-65774850-ccc4-4924-88c2-7dc5af4a461b req-77d2a8ec-a86c-4086-a94e-6df212522599 service nova] Acquiring lock "95f71b47-73c8-4a82-b806-f6f2ed9cdbb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.585478] env[60400]: DEBUG oslo_concurrency.lockutils [req-65774850-ccc4-4924-88c2-7dc5af4a461b req-77d2a8ec-a86c-4086-a94e-6df212522599 service nova] Lock "95f71b47-73c8-4a82-b806-f6f2ed9cdbb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.585933] env[60400]: DEBUG oslo_concurrency.lockutils [req-65774850-ccc4-4924-88c2-7dc5af4a461b req-77d2a8ec-a86c-4086-a94e-6df212522599 service nova] Lock "95f71b47-73c8-4a82-b806-f6f2ed9cdbb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.585933] env[60400]: DEBUG nova.compute.manager [req-65774850-ccc4-4924-88c2-7dc5af4a461b req-77d2a8ec-a86c-4086-a94e-6df212522599 service nova] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] No waiting events found dispatching network-vif-plugged-777d624e-2007-42dc-b553-d6efc26d590f {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 735.586085] env[60400]: WARNING nova.compute.manager [req-65774850-ccc4-4924-88c2-7dc5af4a461b req-77d2a8ec-a86c-4086-a94e-6df212522599 service nova] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Received unexpected event network-vif-plugged-777d624e-2007-42dc-b553-d6efc26d590f for instance with vm_state building and task_state spawning. [ 735.608027] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449803, 'name': CreateVM_Task, 'duration_secs': 0.29412} completed successfully.
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 735.608490] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 735.609380] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 735.610008] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 735.610429] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 735.611098] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c1775c48-1473-48ab-afaf-9c46b9e7842d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.617164] env[60400]: DEBUG oslo_vmware.api [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Waiting for the task: (returnval){ [ 735.617164] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]524849c9-4ef9-c13d-6eb1-83da3df81071" [ 735.617164] env[60400]: _type = "Task" [ 735.617164] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 735.628075] env[60400]: DEBUG oslo_vmware.api [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]524849c9-4ef9-c13d-6eb1-83da3df81071, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 735.656784] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.656996] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.905393] env[60400]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Acquiring lock "49aaf98b-945e-4c5d-8158-641b8650a8a7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.905393] env[60400]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "49aaf98b-945e-4c5d-8158-641b8650a8a7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.128246] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 736.128586] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 736.128754] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 737.394037] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Acquiring lock "cb7a8413-4414-4de6-8d4f-9ac4f1784f35" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance"
{{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.394290] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "cb7a8413-4414-4de6-8d4f-9ac4f1784f35" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.718105] env[60400]: DEBUG nova.compute.manager [req-d2354e58-05cf-4391-a9f1-8ae7c17f2d4d req-8cad7963-6a50-4f0c-afa4-0cfb41f9e04a service nova] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Received event network-changed-777d624e-2007-42dc-b553-d6efc26d590f {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 737.718105] env[60400]: DEBUG nova.compute.manager [req-d2354e58-05cf-4391-a9f1-8ae7c17f2d4d req-8cad7963-6a50-4f0c-afa4-0cfb41f9e04a service nova] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Refreshing instance network info cache due to event network-changed-777d624e-2007-42dc-b553-d6efc26d590f. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 737.718453] env[60400]: DEBUG oslo_concurrency.lockutils [req-d2354e58-05cf-4391-a9f1-8ae7c17f2d4d req-8cad7963-6a50-4f0c-afa4-0cfb41f9e04a service nova] Acquiring lock "refresh_cache-95f71b47-73c8-4a82-b806-f6f2ed9cdbb3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 737.718724] env[60400]: DEBUG oslo_concurrency.lockutils [req-d2354e58-05cf-4391-a9f1-8ae7c17f2d4d req-8cad7963-6a50-4f0c-afa4-0cfb41f9e04a service nova] Acquired lock "refresh_cache-95f71b47-73c8-4a82-b806-f6f2ed9cdbb3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 737.718990] env[60400]: DEBUG nova.network.neutron [req-d2354e58-05cf-4391-a9f1-8ae7c17f2d4d req-8cad7963-6a50-4f0c-afa4-0cfb41f9e04a service nova] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Refreshing network info cache for port 777d624e-2007-42dc-b553-d6efc26d590f {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 738.078902] env[60400]: DEBUG nova.network.neutron [req-d2354e58-05cf-4391-a9f1-8ae7c17f2d4d req-8cad7963-6a50-4f0c-afa4-0cfb41f9e04a service nova] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Updated VIF entry in instance network info cache for port 777d624e-2007-42dc-b553-d6efc26d590f.
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 738.079387] env[60400]: DEBUG nova.network.neutron [req-d2354e58-05cf-4391-a9f1-8ae7c17f2d4d req-8cad7963-6a50-4f0c-afa4-0cfb41f9e04a service nova] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Updating instance_info_cache with network_info: [{"id": "777d624e-2007-42dc-b553-d6efc26d590f", "address": "fa:16:3e:6e:f8:da", "network": {"id": "8e664329-6cf8-471d-9917-498cc1bdf003", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1102702079-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2140598201444851ab98084d07307c86", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9722ea4d-e4a5-48fc-b759-5c4c4796b1ef", "external-id": "nsx-vlan-transportzone-924", "segmentation_id": 924, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap777d624e-20", "ovs_interfaceid": "777d624e-2007-42dc-b553-d6efc26d590f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 738.092104] env[60400]: DEBUG oslo_concurrency.lockutils [req-d2354e58-05cf-4391-a9f1-8ae7c17f2d4d req-8cad7963-6a50-4f0c-afa4-0cfb41f9e04a service nova] Releasing lock "refresh_cache-95f71b47-73c8-4a82-b806-f6f2ed9cdbb3" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 741.074361] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "01b62d6f-6718-45b4-8f67-cdb77c5f4bd0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.074361] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "01b62d6f-6718-45b4-8f67-cdb77c5f4bd0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 758.935638] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 759.932818] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 760.928531] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task
ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 760.932153] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 760.932344] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 760.932491] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 760.942390] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.942600] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.942752] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 760.942910] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 760.944055] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f57f550-f640-4db9-b3ed-a2c4f51d77aa {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.953162] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87ebbd9e-2046-4bd1-858d-bf6a20451adb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.967061] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19ac166c-3b79-453e-9ca0-eb917d63bf74 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.973398] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea210d68-5f6d-4bf9-949a-4d0fb556b494 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.003034] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: 
[ 761.003034] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 761.003174] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 761.083917] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 130961ce-1e22-4320-abc9-30fc5f652be3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 761.084093] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 4540cd82-440c-41e3-8bfa-b384da6fc964 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 761.084221] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 761.084342] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 761.084459] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance cc1d534d-6a43-4575-895d-c3bef84d772e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 761.084568] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance e4f0342a-4169-40aa-b234-a2e2340d5b05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 761.084682] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance f202a181-b5ea-4b06-91ad-86356b51e088 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 761.084819] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 63151ec9-f383-46cc-ac57-c3f7f1569410 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 761.084938] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance c5b391a9-7969-4119-9bc6-b0e1fe7a9713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 761.085061] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 761.107937] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 7476fb96-5247-472c-ab92-ef7e5916cb00 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 761.130773] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 35630c7b-fdf4-4d6d-8e5a-0045f1387f93 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 761.140857] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 837197c0-9ff8-45a2-8bf0-730158a43a17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 761.151282] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 56471a78-08cd-4d1a-b3f5-d1eac277183e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 761.164363] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 1240824e-c5f1-4517-b182-20245311c687 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 761.174841] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 761.184449] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 761.195533] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 49aaf98b-945e-4c5d-8158-641b8650a8a7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 761.205604] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance cb7a8413-4414-4de6-8d4f-9ac4f1784f35 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 761.214439] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 01b62d6f-6718-45b4-8f67-cdb77c5f4bd0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 761.214662] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 761.214807] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 761.444267] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09146f32-44f9-442b-9b00-a1d6209b1688 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 761.452177] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aca89e5a-60ac-4142-9049-7491c9df9f18 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 761.482881] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d2e03b1-e009-46a7-a7cc-b6269700af49 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 761.490277] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6656664-68d3-44c5-a544-4fb89fca287e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 761.504555] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 761.514788] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 761.528805] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 761.528978] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.526s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 762.529909] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 762.529909] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}}
[ 762.529909] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 762.550715] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 762.550899] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 762.551015] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 762.551131] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 762.551251] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 762.551368] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 762.551486] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 762.551601] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 762.551715] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 762.551829] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 762.551947] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}}
[ 762.552412] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 762.552586] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 762.552753] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
[ 777.086080] env[60400]: WARNING oslo_vmware.rw_handles [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles     response.begin()
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 777.086080] env[60400]: ERROR oslo_vmware.rw_handles
[ 777.086650] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/ae2f6962-75e2-4820-b721-c009efd05ad6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 777.087971] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 777.088225] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Copying Virtual Disk [datastore1] vmware_temp/ae2f6962-75e2-4820-b721-c009efd05ad6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/ae2f6962-75e2-4820-b721-c009efd05ad6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 777.088494] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8909bca5-03c1-4ce9-8f41-5da459475f86 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 777.097950] env[60400]: DEBUG oslo_vmware.api [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Waiting for the task: (returnval){
[ 777.097950] env[60400]: value = "task-449804"
[ 777.097950] env[60400]: _type = "Task"
[ 777.097950] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 777.105793] env[60400]: DEBUG oslo_vmware.api [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Task: {'id': task-449804, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 777.608380] env[60400]: DEBUG oslo_vmware.exceptions [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 777.608621] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 777.609173] env[60400]: ERROR nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 777.609173] env[60400]: Faults: ['InvalidArgument']
[ 777.609173] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Traceback (most recent call last):
[ 777.609173] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 777.609173] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     yield resources
[ 777.609173] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 777.609173] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     self.driver.spawn(context, instance, image_meta,
[ 777.609173] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 777.609173] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 777.609173] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 777.609173] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     self._fetch_image_if_missing(context, vi)
[ 777.609173] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     image_cache(vi, tmp_image_ds_loc)
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     vm_util.copy_virtual_disk(
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     session._wait_for_task(vmdk_copy_task)
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     return self.wait_for_task(task_ref)
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     return evt.wait()
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     result = hub.switch()
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 777.609488] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     return self.greenlet.switch()
[ 777.609923] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 777.609923] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     self.f(*self.args, **self.kw)
[ 777.609923] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 777.609923] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     raise exceptions.translate_fault(task_info.error)
[ 777.609923] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 777.609923] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Faults: ['InvalidArgument']
[ 777.609923] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]
[ 777.609923] env[60400]: INFO nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Terminating instance
[ 777.611013] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 777.611221] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 777.611452] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5a6e7ce9-533f-4987-b158-e31af9b3ad25 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 777.615318] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 777.615508] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 777.616230] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23cb90c8-ab38-488a-a746-f6f4dbec1bfd {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 777.621036] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 777.621036] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 777.621036] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-71628512-4a32-4e64-bea0-a00719499338 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 777.625091] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 777.625593] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-45eaa6a5-71ff-4df8-a8a7-9503de18bd26 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 777.627841] env[60400]: DEBUG oslo_vmware.api [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Waiting for the task: (returnval){
[ 777.627841] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5206bff9-fd7c-6840-6a58-aa601323148d"
[ 777.627841] env[60400]: _type = "Task"
[ 777.627841] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 777.635876] env[60400]: DEBUG oslo_vmware.api [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5206bff9-fd7c-6840-6a58-aa601323148d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 777.695440] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 777.695440] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 777.695617] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Deleting the datastore file [datastore1] 130961ce-1e22-4320-abc9-30fc5f652be3 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 777.695877] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c797c0ee-ad2e-45ea-a420-e2930e7484d7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 777.702090] env[60400]: DEBUG oslo_vmware.api [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Waiting for the task: (returnval){
[ 777.702090] env[60400]: value = "task-449806"
[ 777.702090] env[60400]: _type = "Task"
[ 777.702090] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 777.709625] env[60400]: DEBUG oslo_vmware.api [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Task: {'id': task-449806, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 778.138657] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 778.138993] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Creating directory with path [datastore1] vmware_temp/cea983d3-1f85-4946-82f1-6e4328a65e3b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 778.139169] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aceacd44-7e5d-4082-a2a6-16728d0129a2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 778.150790] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Created directory with path [datastore1] vmware_temp/cea983d3-1f85-4946-82f1-6e4328a65e3b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 778.151055] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Fetch image to [datastore1] vmware_temp/cea983d3-1f85-4946-82f1-6e4328a65e3b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 778.151264] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/cea983d3-1f85-4946-82f1-6e4328a65e3b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 778.152078] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05b6ea80-c7d0-4cea-88aa-6d1185ed158c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 778.158919] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c737be3e-8847-4972-876a-41ab3937967e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 778.168216] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97b596ff-d477-442d-92fe-7359210e773c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 778.199664] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b97df202-703e-4209-ad67-c1ebe9aa1804 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 778.207600] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e95c5029-9a14-4d54-bece-24d81f581a69 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 778.211943] env[60400]: DEBUG oslo_vmware.api [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Task: {'id': task-449806, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.059901} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 778.212509] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 778.212728] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 778.212951] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 778.213195] env[60400]: INFO nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 778.215383] env[60400]: DEBUG nova.compute.claims [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 778.215592] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 778.215835] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 778.231635] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 778.281642] env[60400]: DEBUG oslo_vmware.rw_handles [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cea983d3-1f85-4946-82f1-6e4328a65e3b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 778.337386] env[60400]: DEBUG oslo_vmware.rw_handles [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 778.337589] env[60400]: DEBUG oslo_vmware.rw_handles [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cea983d3-1f85-4946-82f1-6e4328a65e3b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 778.527074] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a8bda4a-7932-4e15-9edc-c203a3538cd1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 778.534081] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7af83396-1a8e-4226-b2aa-3219ff7be959 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 778.563916] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ee4661e-1438-4ebb-84cb-b8cbd5c1d3eb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 778.570741] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf97e5c0-c6b9-4104-bf0e-97cfa4c96e61 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 778.583533] env[60400]: DEBUG nova.compute.provider_tree [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 778.591803] env[60400]: DEBUG nova.scheduler.client.report [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 778.604640] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.389s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 778.605168] env[60400]: ERROR nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 778.605168] env[60400]: Faults: ['InvalidArgument']
[ 778.605168] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Traceback (most recent call last):
[ 778.605168] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 778.605168] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     self.driver.spawn(context, instance, image_meta,
[ 778.605168] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 778.605168] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 778.605168] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 778.605168] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     self._fetch_image_if_missing(context, vi)
[ 778.605168] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 778.605168] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     image_cache(vi, tmp_image_ds_loc)
[ 778.605168] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     vm_util.copy_virtual_disk(
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     session._wait_for_task(vmdk_copy_task)
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     return self.wait_for_task(task_ref)
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     return evt.wait()
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     result = hub.switch()
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     return self.greenlet.switch()
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 778.605482] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     self.f(*self.args, **self.kw)
[ 778.605783] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 778.605783] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]     raise exceptions.translate_fault(task_info.error)
[ 778.605783] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 778.605783] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Faults: ['InvalidArgument']
[ 778.605783] env[60400]: ERROR nova.compute.manager [instance: 130961ce-1e22-4320-abc9-30fc5f652be3]
[ 778.605899] env[60400]: DEBUG nova.compute.utils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] VimFaultException {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 778.607151] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Build of instance 130961ce-1e22-4320-abc9-30fc5f652be3 was re-scheduled: A specified parameter was not correct: fileType
[ 778.607151] env[60400]: Faults: ['InvalidArgument'] {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 778.607636] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 778.607835] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 778.608015] env[60400]: DEBUG nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 778.608180] env[60400]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 779.051417] env[60400]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 779.062333] env[60400]: INFO nova.compute.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Took 0.45 seconds to deallocate network for instance.
[ 779.148486] env[60400]: INFO nova.scheduler.client.report [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Deleted allocations for instance 130961ce-1e22-4320-abc9-30fc5f652be3
[ 779.167625] env[60400]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "130961ce-1e22-4320-abc9-30fc5f652be3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 149.716s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 779.184715] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 779.240439] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 779.240711] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 779.242182] env[60400]: INFO nova.compute.claims [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 779.492225] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f5d3782-7d5c-4487-a22c-299521dc05d1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 779.499763] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f1ac172-5b3c-4306-ade3-ab27117d9533 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 779.531261] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a6e2779-bd75-4102-9be8-04ea04d58e64 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 779.537613] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa2fda85-5571-4ef8-a187-aeeae11266dc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 779.551293] env[60400]: DEBUG nova.compute.provider_tree [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 779.559590] env[60400]: DEBUG nova.scheduler.client.report [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 779.575828] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 779.576291] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 779.606942] env[60400]: DEBUG nova.compute.utils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 779.608259] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 779.608435] env[60400]: DEBUG nova.network.neutron [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 779.616331] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 779.680328] env[60400]: DEBUG nova.policy [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e80ef72400df48b9b6a2b8b62fad4d5b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4714528fb7fb41eb908a9bda448bdffc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 779.683863] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Start spawning the instance on the hypervisor. 
{{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 779.703832] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=<?>,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T04:32:18Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 779.704041] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 779.704197] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 779.704370] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 779.704509] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 779.704649] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 779.704842] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 779.704991] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}}
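Note: the nova.virt.hardware lines above show the driver collapsing flavor and image CPU-topology constraints (no explicit limits, so the 65536 defaults apply) and then enumerating candidate topologies for 1 vCPU. A minimal sketch of that enumeration step, assuming a simple divisor factorisation of the vCPU count under per-dimension maxima; the names Topology and possible_topologies are hypothetical and this is not nova's actual _get_possible_cpu_topologies, which additionally applies preferences and sorting:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Topology:
        sockets: int
        cores: int
        threads: int

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Enumerate every sockets * cores * threads factorisation of vcpus
        # that respects the per-dimension maxima.
        found = []
        for s in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % s:
                continue
            for c in range(1, min(vcpus // s, max_cores) + 1):
                if (vcpus // s) % c:
                    continue
                t = vcpus // (s * c)
                if t <= max_threads:
                    found.append(Topology(sockets=s, cores=c, threads=t))
        return found

    # The m1.nano flavor above has 1 vCPU, so only one candidate exists:
    print(possible_topologies(1))   # [Topology(sockets=1, cores=1, threads=1)]

For the 1-vCPU flavor this yields exactly one candidate, matching the "Got 1 possible topologies" and "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]" lines that follow.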
[ 779.705240] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 779.705406] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 779.705572] env[60400]: DEBUG nova.virt.hardware [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 779.706400] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abfbd9d8-2f8d-4acb-b3f4-1291f4ffcb93 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.714574] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-754e7bd3-f1c2-44f6-801e-a4fd501f41cb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.102318] env[60400]: DEBUG nova.network.neutron [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Successfully created port: cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 781.398816] env[60400]: DEBUG nova.network.neutron [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Successfully updated port: cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 781.408995] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "refresh_cache-7476fb96-5247-472c-ab92-ef7e5916cb00" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 781.408995] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquired lock "refresh_cache-7476fb96-5247-472c-ab92-ef7e5916cb00" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 781.408995] env[60400]: DEBUG nova.network.neutron [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}}
[ 781.508498] env[60400]: DEBUG nova.network.neutron [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 781.701122] env[60400]: DEBUG nova.compute.manager [req-34b6ab2f-bf4b-4a5a-856d-f0a906da9e97 req-0acc767f-c6ef-4819-9aab-e442e180a586 service nova] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Received event network-vif-plugged-cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 781.701122] env[60400]: DEBUG oslo_concurrency.lockutils [req-34b6ab2f-bf4b-4a5a-856d-f0a906da9e97 req-0acc767f-c6ef-4819-9aab-e442e180a586 service nova] Acquiring lock "7476fb96-5247-472c-ab92-ef7e5916cb00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 781.701122] env[60400]: DEBUG oslo_concurrency.lockutils [req-34b6ab2f-bf4b-4a5a-856d-f0a906da9e97 req-0acc767f-c6ef-4819-9aab-e442e180a586 service nova] Lock "7476fb96-5247-472c-ab92-ef7e5916cb00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 781.701122] env[60400]: DEBUG oslo_concurrency.lockutils [req-34b6ab2f-bf4b-4a5a-856d-f0a906da9e97 req-0acc767f-c6ef-4819-9aab-e442e180a586 service nova] Lock "7476fb96-5247-472c-ab92-ef7e5916cb00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 781.701316] env[60400]: DEBUG nova.compute.manager [req-34b6ab2f-bf4b-4a5a-856d-f0a906da9e97 req-0acc767f-c6ef-4819-9aab-e442e180a586 service nova] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] No waiting events found dispatching network-vif-plugged-cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 781.701707] env[60400]: WARNING nova.compute.manager [req-34b6ab2f-bf4b-4a5a-856d-f0a906da9e97 req-0acc767f-c6ef-4819-9aab-e442e180a586 service nova] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Received unexpected event network-vif-plugged-cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9 for instance with vm_state building and task_state spawning.
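Note: the sequence above is Nova's external-event plumbing: Neutron delivers network-vif-plugged through the compute API, the compute manager looks for a waiter registered for that (instance, event) pair, finds none ("No waiting events found"), and logs the event as unexpected; while the instance is still spawning this is commonly benign. A minimal sketch of the prepare/dispatch pattern, assuming a threading-based registry; the class and function names here are hypothetical and this is not Nova's InstanceEvents implementation:

    import threading
    from collections import defaultdict

    class InstanceEvents:
        def __init__(self):
            # instance uuid -> event name -> threading.Event
            self._waiters = defaultdict(dict)
            self._lock = threading.Lock()

        def prepare_for_event(self, instance_uuid, name):
            # Called *before* starting the operation that triggers the event.
            ev = threading.Event()
            with self._lock:
                self._waiters[instance_uuid][name] = ev
            return ev

        def pop_instance_event(self, instance_uuid, name):
            with self._lock:
                return self._waiters[instance_uuid].pop(name, None)

    events = InstanceEvents()

    def external_instance_event(instance_uuid, name):
        ev = events.pop_instance_event(instance_uuid, name)
        if ev is None:
            # Mirrors the WARNING above: nobody had registered a waiter yet.
            print(f"Received unexpected event {name} for instance {instance_uuid}")
        else:
            ev.set()

    external_instance_event("7476fb96-5247-472c-ab92-ef7e5916cb00",
                            "network-vif-plugged-cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9")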
[ 781.917853] env[60400]: DEBUG nova.network.neutron [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Updating instance_info_cache with network_info: [{"id": "cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9", "address": "fa:16:3e:c9:00:54", "network": {"id": "de1f72d9-9c92-40d9-aaf1-3b1d12142eba", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-317054403-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4714528fb7fb41eb908a9bda448bdffc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6fab536-1e48-4d07-992a-076f0e6d089c", "external-id": "nsx-vlan-transportzone-61", "segmentation_id": 61, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcbe23c5f-78", "ovs_interfaceid": "cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 781.931331] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Releasing lock "refresh_cache-7476fb96-5247-472c-ab92-ef7e5916cb00" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 781.931898] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Instance network_info: |[{"id": "cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9", "address": "fa:16:3e:c9:00:54", "network": {"id": "de1f72d9-9c92-40d9-aaf1-3b1d12142eba", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-317054403-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4714528fb7fb41eb908a9bda448bdffc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6fab536-1e48-4d07-992a-076f0e6d089c", "external-id": "nsx-vlan-transportzone-61", "segmentation_id": 61, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcbe23c5f-78", "ovs_interfaceid": "cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 781.934015] env[60400]: DEBUG 
nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c9:00:54', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd6fab536-1e48-4d07-992a-076f0e6d089c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 781.941233] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Creating folder: Project (4714528fb7fb41eb908a9bda448bdffc). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 781.941897] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1131d148-193c-440a-b928-1dfdbb4a98a5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.954220] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Created folder: Project (4714528fb7fb41eb908a9bda448bdffc) in parent group-v119075. [ 781.954741] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Creating folder: Instances. Parent ref: group-v119116. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 781.956018] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-363e90e7-3804-4f98-9705-b9532b729e64 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.965975] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Created folder: Instances in parent group-v119116. [ 781.965975] env[60400]: DEBUG oslo.service.loopingcall [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 781.965975] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 781.965975] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ddbb4038-52d5-45e6-9a76-c452f5b8a51d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.003494] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 782.003494] env[60400]: value = "task-449809" [ 782.003494] env[60400]: _type = "Task" [ 782.003494] env[60400]: } to complete. 
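Note: the CreateVM_Task wait above and the progress polls just below show oslo.vmware's task contract: invoking a SOAP method returns a Task reference, which is then polled until it reaches a terminal state ("progress is 0%." ... "completed successfully" with duration_secs). A minimal sketch of such a polling loop, assuming a get_task_info callable standing in for the real session; these names are hypothetical and this is not oslo.vmware's wait_for_task:

    import time

    class TaskFailed(Exception):
        pass

    def wait_for_task(get_task_info, task_ref, interval=0.5):
        # Poll task_ref until it succeeds or errors, reporting progress
        # in the same style as the log lines above.
        while True:
            info = get_task_info(task_ref)
            if info["state"] == "running":
                print(f"Task: {task_ref} progress is {info.get('progress', 0)}%.")
            elif info["state"] == "success":
                return info
            elif info["state"] == "error":
                raise TaskFailed(info.get("error", "unknown fault"))
            time.sleep(interval)

    # Fake session: reports 'running' once, then 'success'.
    states = iter([{"state": "running", "progress": 0}, {"state": "success"}])
    print(wait_for_task(lambda ref: next(states), "task-449809", interval=0.01))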
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 782.015550] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449809, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 782.514503] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449809, 'name': CreateVM_Task, 'duration_secs': 0.264578} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 782.515083] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 782.515874] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 782.516219] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 782.516678] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 782.517048] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f64dc21a-988b-4e5e-96ac-0cbc53d4be8d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.524035] env[60400]: DEBUG oslo_vmware.api [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Waiting for the task: (returnval){ [ 782.524035] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5299c77b-3d62-4955-0abd-bf1745b0e842" [ 782.524035] env[60400]: _type = "Task" [ 782.524035] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 782.530575] env[60400]: DEBUG oslo_vmware.api [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5299c77b-3d62-4955-0abd-bf1745b0e842, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 783.034798] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 783.034798] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 783.034798] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 783.727549] env[60400]: DEBUG nova.compute.manager [req-bc805cd6-38ac-4b5e-b7af-50c84aa91294 req-c3f55302-8b6d-4097-b71d-62e6e1f081a0 service nova] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Received event network-changed-cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 783.730592] env[60400]: DEBUG nova.compute.manager [req-bc805cd6-38ac-4b5e-b7af-50c84aa91294 req-c3f55302-8b6d-4097-b71d-62e6e1f081a0 service nova] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Refreshing instance network info cache due to event network-changed-cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 783.730592] env[60400]: DEBUG oslo_concurrency.lockutils [req-bc805cd6-38ac-4b5e-b7af-50c84aa91294 req-c3f55302-8b6d-4097-b71d-62e6e1f081a0 service nova] Acquiring lock "refresh_cache-7476fb96-5247-472c-ab92-ef7e5916cb00" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 783.730592] env[60400]: DEBUG oslo_concurrency.lockutils [req-bc805cd6-38ac-4b5e-b7af-50c84aa91294 req-c3f55302-8b6d-4097-b71d-62e6e1f081a0 service nova] Acquired lock "refresh_cache-7476fb96-5247-472c-ab92-ef7e5916cb00" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 783.730592] env[60400]: DEBUG nova.network.neutron [req-bc805cd6-38ac-4b5e-b7af-50c84aa91294 req-c3f55302-8b6d-4097-b71d-62e6e1f081a0 service nova] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Refreshing network info cache for port cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 784.214885] env[60400]: DEBUG nova.network.neutron [req-bc805cd6-38ac-4b5e-b7af-50c84aa91294 req-c3f55302-8b6d-4097-b71d-62e6e1f081a0 service nova] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Updated VIF entry in instance network info cache for port cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9. 
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 784.214885] env[60400]: DEBUG nova.network.neutron [req-bc805cd6-38ac-4b5e-b7af-50c84aa91294 req-c3f55302-8b6d-4097-b71d-62e6e1f081a0 service nova] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Updating instance_info_cache with network_info: [{"id": "cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9", "address": "fa:16:3e:c9:00:54", "network": {"id": "de1f72d9-9c92-40d9-aaf1-3b1d12142eba", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-317054403-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4714528fb7fb41eb908a9bda448bdffc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6fab536-1e48-4d07-992a-076f0e6d089c", "external-id": "nsx-vlan-transportzone-61", "segmentation_id": 61, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcbe23c5f-78", "ovs_interfaceid": "cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 784.224308] env[60400]: DEBUG oslo_concurrency.lockutils [req-bc805cd6-38ac-4b5e-b7af-50c84aa91294 req-c3f55302-8b6d-4097-b71d-62e6e1f081a0 service nova] Releasing lock "refresh_cache-7476fb96-5247-472c-ab92-ef7e5916cb00" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 818.934569] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 820.933394] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 820.933685] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 821.928590] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 821.933078] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 821.933223] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9808}} [ 821.933352] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 821.954228] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 821.954540] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 821.954540] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 821.954619] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 821.954734] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 821.954855] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 821.954976] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 821.955284] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 821.955430] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 821.955557] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Skipping network cache update for instance because it is Building. 
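Note: _heal_instance_info_cache runs on a periodic-task timer, rebuilds its candidate list, and skips every instance still being built, so this pass ends with no cache refresh (see "Didn't find any instances for network info cache update." just below). A minimal sketch of that selection rule, with hypothetical names; this is illustrative only, not ComputeManager's implementation:

    def pick_instance_to_heal(instances):
        # Walk the rebuilt candidate list; instances still building have no
        # stable network info yet, so they are skipped.
        for inst in instances:
            if inst["vm_state"] == "building":
                print(f"[instance: {inst['uuid']}] Skipping network cache update "
                      "for instance because it is Building.")
                continue
            return inst
        print("Didn't find any instances for network info cache update.")
        return None

    pick_instance_to_heal([
        {"uuid": "7476fb96-5247-472c-ab92-ef7e5916cb00", "vm_state": "building"},
    ])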
{{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 821.955708] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 821.956326] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 821.956498] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 821.965287] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 821.965494] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 821.965651] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 821.965840] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 821.966814] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9f5bbda-0e74-429d-83b5-6152ff763437 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 821.975796] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f65c55ce-e4e1-487a-ab08-c32cade3f3c4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 821.990899] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54a6d7a6-35a6-4429-8b56-fb24df1d0031 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 821.997099] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1949fca-9688-4cff-926f-ea33faca173e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 822.025928] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181795MB free_disk=118GB free_vcpus=48 
pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 822.025928] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 822.025928] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 822.091157] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 4540cd82-440c-41e3-8bfa-b384da6fc964 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 822.091349] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 822.091521] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 822.091586] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance cc1d534d-6a43-4575-895d-c3bef84d772e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 822.091762] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance e4f0342a-4169-40aa-b234-a2e2340d5b05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 822.091900] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance f202a181-b5ea-4b06-91ad-86356b51e088 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 822.092027] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 63151ec9-f383-46cc-ac57-c3f7f1569410 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 822.092138] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance c5b391a9-7969-4119-9bc6-b0e1fe7a9713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 822.092250] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 822.092361] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 7476fb96-5247-472c-ab92-ef7e5916cb00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 822.103374] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 35630c7b-fdf4-4d6d-8e5a-0045f1387f93 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 822.113491] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 837197c0-9ff8-45a2-8bf0-730158a43a17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 822.122941] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 56471a78-08cd-4d1a-b3f5-d1eac277183e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 822.132451] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 1240824e-c5f1-4517-b182-20245311c687 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 822.141398] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 822.150221] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 822.159373] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 49aaf98b-945e-4c5d-8158-641b8650a8a7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 822.168488] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance cb7a8413-4414-4de6-8d4f-9ac4f1784f35 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 822.177262] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 01b62d6f-6718-45b4-8f67-cdb77c5f4bd0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
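Note: the audit above separates instances actively managed on this host (allocations left alone) from instances that are scheduled here but not yet started (allocation healing skipped). The final resource view reported just below is consistent with the ten active claims plus the 512 MB reserved memory from the inventory data logged earlier; a quick arithmetic check using only numbers that appear in this log:

    # Each active instance claims {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1};
    # reserved_mb comes from the MEMORY_MB inventory record above.
    instances = 10
    flavor = {"VCPU": 1, "MEMORY_MB": 128, "DISK_GB": 1}
    reserved_mb = 512

    used_ram = reserved_mb + instances * flavor["MEMORY_MB"]   # 512 + 1280
    used_disk = instances * flavor["DISK_GB"]
    used_vcpus = instances * flavor["VCPU"]
    assert (used_ram, used_disk, used_vcpus) == (1792, 10, 10)
    print(f"used_ram={used_ram}MB used_disk={used_disk}GB used_vcpus={used_vcpus}")

This matches the "Final resource view: ... used_ram=1792MB ... used_disk=10GB ... used_vcpus=10" record that follows.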
{{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 822.177505] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 822.177755] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 822.377033] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20eda5a3-ba3b-4c06-8a8a-76ecca6b5ede {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 822.384148] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32cc7d9a-db91-4a88-b3e0-ed47b9e2d9a2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 822.414817] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ba89ebf-4e80-4214-bc4b-ff2649add199 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 822.422082] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8f700b6-b196-4d1a-afce-1a1b0b0123f0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 822.434967] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 822.443753] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 822.458285] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 822.458452] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.433s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 823.434857] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running 
periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 823.455445] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 823.455617] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 823.455744] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 826.379814] env[60400]: WARNING oslo_vmware.rw_handles [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 826.379814] env[60400]: ERROR oslo_vmware.rw_handles [ 826.380464] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/cea983d3-1f85-4946-82f1-6e4328a65e3b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 826.382256] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 826.382537] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 
tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Copying Virtual Disk [datastore1] vmware_temp/cea983d3-1f85-4946-82f1-6e4328a65e3b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/cea983d3-1f85-4946-82f1-6e4328a65e3b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 826.382844] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cdbf9c3b-852e-41b8-8603-be26ae0fdbca {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.392042] env[60400]: DEBUG oslo_vmware.api [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Waiting for the task: (returnval){ [ 826.392042] env[60400]: value = "task-449810" [ 826.392042] env[60400]: _type = "Task" [ 826.392042] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 826.400300] env[60400]: DEBUG oslo_vmware.api [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Task: {'id': task-449810, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 826.902427] env[60400]: DEBUG oslo_vmware.exceptions [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Fault InvalidArgument not matched. 
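Note: "Fault InvalidArgument not matched." is oslo.vmware's fault translation at work: the vCenter fault name is looked up in a registry of specific exception classes and, when absent, falls back to a generic VimFaultException carrying the fault list, which is what the spawn failure below then raises ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']). A minimal sketch of that lookup-with-fallback pattern; this is a simplified model with hypothetical class names, not oslo.vmware's actual get_fault_class/translate_fault code:

    class VimException(Exception):
        pass

    class VimFaultException(VimException):
        def __init__(self, fault_list, msg):
            super().__init__(msg)
            self.fault_list = fault_list

    class FileNotFoundException(VimException):
        pass

    # Registry of fault names with dedicated exception classes.
    _FAULT_CLASSES = {"FileNotFound": FileNotFoundException}

    def translate_fault(fault_name, msg):
        cls = _FAULT_CLASSES.get(fault_name)
        if cls is None:
            # Mirrors the DEBUG line above before the generic fallback.
            print(f"Fault {fault_name} not matched.")
            return VimFaultException([fault_name], msg)
        return cls(msg)

    exc = translate_fault("InvalidArgument",
                          "A specified parameter was not correct: fileType")
    print(type(exc).__name__, exc.fault_list)   # VimFaultException ['InvalidArgument']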
{{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 826.902686] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 826.903270] env[60400]: ERROR nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 826.903270] env[60400]: Faults: ['InvalidArgument'] [ 826.903270] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Traceback (most recent call last): [ 826.903270] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 826.903270] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] yield resources [ 826.903270] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 826.903270] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] self.driver.spawn(context, instance, image_meta, [ 826.903270] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 826.903270] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] self._vmops.spawn(context, instance, image_meta, injected_files, [ 826.903270] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 826.903270] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] self._fetch_image_if_missing(context, vi) [ 826.903270] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] image_cache(vi, tmp_image_ds_loc) [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] vm_util.copy_virtual_disk( [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] session._wait_for_task(vmdk_copy_task) [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] return self.wait_for_task(task_ref) [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] return evt.wait() [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] result = hub.switch() [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 826.903637] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] return self.greenlet.switch() [ 826.904020] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 826.904020] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] self.f(*self.args, **self.kw) [ 826.904020] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 826.904020] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] raise exceptions.translate_fault(task_info.error) [ 826.904020] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 826.904020] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Faults: ['InvalidArgument'] [ 826.904020] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] [ 826.904020] env[60400]: INFO nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Terminating instance [ 826.905219] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 826.905434] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 826.905686] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-43d5bb44-df3e-4723-bab1-0c3f059fc84f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.907874] env[60400]: DEBUG 
nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 826.908072] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 826.908788] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbc1af6a-a022-4e38-b296-3ec60218d3c9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.915491] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 826.915686] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ab3b718a-8165-4a2c-9767-d425524cf981 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.917759] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 826.917917] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 826.918815] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f8997f9e-eacd-44f1-9625-5bf90ea70340 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.923589] env[60400]: DEBUG oslo_vmware.api [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Waiting for the task: (returnval){ [ 826.923589] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]526fee53-54b6-0cd6-2e8e-6f591d33d6c4" [ 826.923589] env[60400]: _type = "Task" [ 826.923589] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 826.931579] env[60400]: DEBUG oslo_vmware.api [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]526fee53-54b6-0cd6-2e8e-6f591d33d6c4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 826.981822] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 826.982229] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 826.982484] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Deleting the datastore file [datastore1] 4540cd82-440c-41e3-8bfa-b384da6fc964 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 826.982723] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f5d0afb9-d047-4807-93dd-6822dec509f0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.989017] env[60400]: DEBUG oslo_vmware.api [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Waiting for the task: (returnval){ [ 826.989017] env[60400]: value = "task-449812" [ 826.989017] env[60400]: _type = "Task" [ 826.989017] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 826.997602] env[60400]: DEBUG oslo_vmware.api [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Task: {'id': task-449812, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 827.434499] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 827.434853] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Creating directory with path [datastore1] vmware_temp/dba9b854-65e2-488b-a1f1-b2f30aaa1b23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 827.435222] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cdaafc92-0273-42d2-b472-aad8e3d089de {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.447332] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Created directory with path [datastore1] vmware_temp/dba9b854-65e2-488b-a1f1-b2f30aaa1b23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 827.447656] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Fetch image to [datastore1] vmware_temp/dba9b854-65e2-488b-a1f1-b2f30aaa1b23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 827.447768] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/dba9b854-65e2-488b-a1f1-b2f30aaa1b23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 827.448518] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59dcb453-4cb9-4c0e-8607-319a1e8329d8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.455319] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fce0fad-8fc8-4ec9-bfcd-09aa64ac7535 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.464918] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fae506cb-0ffc-4f8b-96a4-c0f04f44409d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.498910] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44b96f5a-e5a6-479c-8b76-00d09028efa3 {{(pid=60400) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.506635] env[60400]: DEBUG oslo_vmware.api [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Task: {'id': task-449812, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068887} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 827.508133] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 827.508323] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 827.508488] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 827.508652] env[60400]: INFO nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 827.510465] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7a99aabc-6e7b-4a75-b994-449b7bf1e51b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 827.512471] env[60400]: DEBUG nova.compute.claims [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 827.512635] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 827.512836] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 827.537256] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 827.585546] env[60400]: DEBUG oslo_vmware.rw_handles [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dba9b854-65e2-488b-a1f1-b2f30aaa1b23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 827.644501] env[60400]: DEBUG oslo_vmware.rw_handles [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 827.644688] env[60400]: DEBUG oslo_vmware.rw_handles [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dba9b854-65e2-488b-a1f1-b2f30aaa1b23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 827.830222] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78206d55-d26f-4df8-85de-33ea46238ebf {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 827.837660] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa2a9777-14d8-4476-8906-bb56fac25089 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 827.869085] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-191c245b-61b5-4a6f-82f7-690abeba4ae1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 827.876541] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-399aac77-5598-4557-969a-3a27d6531b37 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 827.890473] env[60400]: DEBUG nova.compute.provider_tree [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 827.898765] env[60400]: DEBUG nova.scheduler.client.report [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 827.913134] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.400s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 827.913655] env[60400]: ERROR nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 827.913655] env[60400]: Faults: ['InvalidArgument']
[ 827.913655] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Traceback (most recent call last):
[ 827.913655] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 827.913655] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] self.driver.spawn(context, instance, image_meta,
[ 827.913655] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 827.913655] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 827.913655] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 827.913655] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] self._fetch_image_if_missing(context, vi)
[ 827.913655] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 827.913655] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] image_cache(vi, tmp_image_ds_loc)
[ 827.913655] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] vm_util.copy_virtual_disk(
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] session._wait_for_task(vmdk_copy_task)
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] return self.wait_for_task(task_ref)
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] return evt.wait()
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] result = hub.switch()
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] return self.greenlet.switch()
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 827.913988] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] self.f(*self.args, **self.kw)
[ 827.914424] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 827.914424] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] raise exceptions.translate_fault(task_info.error)
[ 827.914424] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 827.914424] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Faults: ['InvalidArgument']
[ 827.914424] env[60400]: ERROR nova.compute.manager [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964]
[ 827.914424] env[60400]: DEBUG nova.compute.utils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] VimFaultException {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 827.915967] env[60400]: DEBUG nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Build of instance 4540cd82-440c-41e3-8bfa-b384da6fc964 was re-scheduled: A specified parameter was not correct: fileType
[ 827.915967] env[60400]: Faults: ['InvalidArgument'] {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 827.916360] env[60400]: DEBUG nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 827.916531] env[60400]: DEBUG nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 827.916681] env[60400]: DEBUG nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 827.916864] env[60400]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 828.131825] env[60400]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 828.142905] env[60400]: INFO nova.compute.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Took 0.23 seconds to deallocate network for instance.
[ 828.226235] env[60400]: INFO nova.scheduler.client.report [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Deleted allocations for instance 4540cd82-440c-41e3-8bfa-b384da6fc964
[ 828.245008] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "4540cd82-440c-41e3-8bfa-b384da6fc964" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 197.023s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 828.258798] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 828.310256] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 828.310501] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 828.311962] env[60400]: INFO nova.compute.claims [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 828.574836] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a485200-0e86-4217-8127-908b4864e420 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 828.582571] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58096333-7564-42d1-a7f0-fca6f40e5828 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 828.613115] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec4f9011-9414-4b75-b828-485a9b50211b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 828.621028] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f30f7e15-f417-4aac-ae93-8fd05a06d03c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 828.634052] env[60400]: DEBUG nova.compute.provider_tree [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 828.641556] env[60400]: DEBUG nova.scheduler.client.report [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 828.660292] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 828.660865] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 828.692226] env[60400]: DEBUG nova.compute.utils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 828.695314] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 828.695314] env[60400]: DEBUG nova.network.neutron [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 828.703646] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 828.764711] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 828.785791] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=<?>,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T04:32:18Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}}
[ 828.786531] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}}
[ 828.786531] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}}
[ 828.786531] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}}
[ 828.786758] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}}
[ 828.786758] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}}
[ 828.786916] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}}
[ 828.787154] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}}
[ 828.787338] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}}
[ 828.787499] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}}
[ 828.787664] env[60400]: DEBUG nova.virt.hardware [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}}
[ 828.788690] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fff23404-88db-4e90-b70f-683629a6bbb1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 828.792568] env[60400]: DEBUG nova.policy [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34ead03015734f3eb4679cfd446be51c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '64df375499704a52a28c8e3086612623', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}}
[ 828.799089] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a79f70cc-f2d3-4513-b562-a3eaaee616c6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 829.276766] env[60400]: DEBUG nova.network.neutron [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Successfully created port: 25bf61e2-4397-46ff-abad-121b47570779 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 830.438144] env[60400]: DEBUG nova.network.neutron [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Successfully updated port: 25bf61e2-4397-46ff-abad-121b47570779 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 830.446012] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "refresh_cache-35630c7b-fdf4-4d6d-8e5a-0045f1387f93" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 830.446162] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquired lock "refresh_cache-35630c7b-fdf4-4d6d-8e5a-0045f1387f93" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 830.446310] env[60400]: DEBUG nova.network.neutron [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}}
[ 830.522616] env[60400]: DEBUG nova.network.neutron [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 830.697109] env[60400]: DEBUG nova.compute.manager [req-2ee149f3-c8b6-4494-9042-e82e46405278 req-33ed783f-fee2-4356-9025-1c29d2f7a9bc service nova] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Received event network-vif-plugged-25bf61e2-4397-46ff-abad-121b47570779 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 830.697301] env[60400]: DEBUG oslo_concurrency.lockutils [req-2ee149f3-c8b6-4494-9042-e82e46405278 req-33ed783f-fee2-4356-9025-1c29d2f7a9bc service nova] Acquiring lock "35630c7b-fdf4-4d6d-8e5a-0045f1387f93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 830.697552] env[60400]: DEBUG oslo_concurrency.lockutils [req-2ee149f3-c8b6-4494-9042-e82e46405278 req-33ed783f-fee2-4356-9025-1c29d2f7a9bc service nova] Lock "35630c7b-fdf4-4d6d-8e5a-0045f1387f93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 830.697722] env[60400]: DEBUG oslo_concurrency.lockutils [req-2ee149f3-c8b6-4494-9042-e82e46405278 req-33ed783f-fee2-4356-9025-1c29d2f7a9bc service nova] Lock "35630c7b-fdf4-4d6d-8e5a-0045f1387f93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 830.697880] env[60400]: DEBUG nova.compute.manager [req-2ee149f3-c8b6-4494-9042-e82e46405278 req-33ed783f-fee2-4356-9025-1c29d2f7a9bc service nova] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] No waiting events found dispatching network-vif-plugged-25bf61e2-4397-46ff-abad-121b47570779 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 830.698268] env[60400]: WARNING nova.compute.manager [req-2ee149f3-c8b6-4494-9042-e82e46405278 req-33ed783f-fee2-4356-9025-1c29d2f7a9bc service nova] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Received unexpected event network-vif-plugged-25bf61e2-4397-46ff-abad-121b47570779 for instance with vm_state building and task_state spawning.
[ 830.925832] env[60400]: DEBUG nova.network.neutron [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Updating instance_info_cache with network_info: [{"id": "25bf61e2-4397-46ff-abad-121b47570779", "address": "fa:16:3e:fa:99:ff", "network": {"id": "968e8167-6c75-4fe7-87f5-4059d7324410", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-222734479-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "64df375499704a52a28c8e3086612623", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "86a35d07-53d3-46b3-92cb-ae34236c0f41", "external-id": "nsx-vlan-transportzone-811", "segmentation_id": 811, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25bf61e2-43", "ovs_interfaceid": "25bf61e2-4397-46ff-abad-121b47570779", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 830.936112] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Releasing lock "refresh_cache-35630c7b-fdf4-4d6d-8e5a-0045f1387f93" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 830.936404] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Instance network_info: |[{"id": "25bf61e2-4397-46ff-abad-121b47570779", "address": "fa:16:3e:fa:99:ff", "network": {"id": "968e8167-6c75-4fe7-87f5-4059d7324410", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-222734479-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "64df375499704a52a28c8e3086612623", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "86a35d07-53d3-46b3-92cb-ae34236c0f41", "external-id": "nsx-vlan-transportzone-811", "segmentation_id": 811, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25bf61e2-43", "ovs_interfaceid": "25bf61e2-4397-46ff-abad-121b47570779", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 830.936798] env[60400]: DEBUG 
nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fa:99:ff', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '86a35d07-53d3-46b3-92cb-ae34236c0f41', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '25bf61e2-4397-46ff-abad-121b47570779', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 830.945524] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Creating folder: Project (64df375499704a52a28c8e3086612623). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 830.946346] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-01ebaa64-f5ec-40bf-aac1-2ae4e08dc2f3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 830.958253] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Created folder: Project (64df375499704a52a28c8e3086612623) in parent group-v119075. [ 830.958469] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Creating folder: Instances. Parent ref: group-v119119. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 830.959495] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-25dd08ff-813b-442c-a693-ac29a3044f79 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 830.968199] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Created folder: Instances in parent group-v119119. [ 830.969369] env[60400]: DEBUG oslo.service.loopingcall [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 830.969612] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 830.969831] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6a967085-e3fb-4ca1-b335-6a269c8884f0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 830.992673] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 830.992673] env[60400]: value = "task-449815" [ 830.992673] env[60400]: _type = "Task" [ 830.992673] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 831.001018] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449815, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 831.298304] env[60400]: DEBUG oslo_concurrency.lockutils [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 831.504154] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449815, 'name': CreateVM_Task, 'duration_secs': 0.308457} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 831.504154] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 831.504557] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 831.504695] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 831.504983] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 831.505226] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6cfcbc89-0a25-41df-ba58-340884493990 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.510185] env[60400]: DEBUG oslo_vmware.api [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Waiting for the task: (returnval){ [ 831.510185] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]525cd1f1-0cc1-e6ad-2d4f-9c9fc02de023" [ 831.510185] env[60400]: _type = "Task" [ 831.510185] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 831.524412] env[60400]: DEBUG oslo_vmware.api [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]525cd1f1-0cc1-e6ad-2d4f-9c9fc02de023, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 832.021169] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 832.021426] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 832.021643] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 832.810988] env[60400]: DEBUG nova.compute.manager [req-7c3162ab-4a56-4caf-85df-af84c2123184 req-c7d83d3d-54e2-4093-9030-af4c817471cc service nova] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Received event network-changed-25bf61e2-4397-46ff-abad-121b47570779 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 832.811230] env[60400]: DEBUG nova.compute.manager [req-7c3162ab-4a56-4caf-85df-af84c2123184 req-c7d83d3d-54e2-4093-9030-af4c817471cc service nova] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Refreshing instance network info cache due to event network-changed-25bf61e2-4397-46ff-abad-121b47570779. 
{{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 832.811374] env[60400]: DEBUG oslo_concurrency.lockutils [req-7c3162ab-4a56-4caf-85df-af84c2123184 req-c7d83d3d-54e2-4093-9030-af4c817471cc service nova] Acquiring lock "refresh_cache-35630c7b-fdf4-4d6d-8e5a-0045f1387f93" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 832.811550] env[60400]: DEBUG oslo_concurrency.lockutils [req-7c3162ab-4a56-4caf-85df-af84c2123184 req-c7d83d3d-54e2-4093-9030-af4c817471cc service nova] Acquired lock "refresh_cache-35630c7b-fdf4-4d6d-8e5a-0045f1387f93" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 832.811653] env[60400]: DEBUG nova.network.neutron [req-7c3162ab-4a56-4caf-85df-af84c2123184 req-c7d83d3d-54e2-4093-9030-af4c817471cc service nova] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Refreshing network info cache for port 25bf61e2-4397-46ff-abad-121b47570779 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 832.869640] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 833.212743] env[60400]: DEBUG oslo_concurrency.lockutils [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "cc1d534d-6a43-4575-895d-c3bef84d772e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 833.275030] env[60400]: DEBUG nova.network.neutron [req-7c3162ab-4a56-4caf-85df-af84c2123184 req-c7d83d3d-54e2-4093-9030-af4c817471cc service nova] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Updated VIF entry in instance network info cache for port 25bf61e2-4397-46ff-abad-121b47570779. 
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 833.275030] env[60400]: DEBUG nova.network.neutron [req-7c3162ab-4a56-4caf-85df-af84c2123184 req-c7d83d3d-54e2-4093-9030-af4c817471cc service nova] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Updating instance_info_cache with network_info: [{"id": "25bf61e2-4397-46ff-abad-121b47570779", "address": "fa:16:3e:fa:99:ff", "network": {"id": "968e8167-6c75-4fe7-87f5-4059d7324410", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-222734479-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "64df375499704a52a28c8e3086612623", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "86a35d07-53d3-46b3-92cb-ae34236c0f41", "external-id": "nsx-vlan-transportzone-811", "segmentation_id": 811, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25bf61e2-43", "ovs_interfaceid": "25bf61e2-4397-46ff-abad-121b47570779", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 833.284482] env[60400]: DEBUG oslo_concurrency.lockutils [req-7c3162ab-4a56-4caf-85df-af84c2123184 req-c7d83d3d-54e2-4093-9030-af4c817471cc service nova] Releasing lock "refresh_cache-35630c7b-fdf4-4d6d-8e5a-0045f1387f93" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 839.621340] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "19881c50-a8ff-411f-b570-d4dc9ef3b0dc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.621638] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "19881c50-a8ff-411f-b570-d4dc9ef3b0dc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 845.634379] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "e4f0342a-4169-40aa-b234-a2e2340d5b05" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 877.122721] env[60400]: WARNING oslo_vmware.rw_handles [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Error occurred while reading the HTTP response.: 
http.client.RemoteDisconnected: Remote end closed connection without response [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 877.122721] env[60400]: ERROR oslo_vmware.rw_handles [ 877.123355] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/dba9b854-65e2-488b-a1f1-b2f30aaa1b23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 877.124976] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 877.125233] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Copying Virtual Disk [datastore1] vmware_temp/dba9b854-65e2-488b-a1f1-b2f30aaa1b23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/dba9b854-65e2-488b-a1f1-b2f30aaa1b23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 877.125538] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-de8b1c71-5ecc-4f36-8c9b-df35e3ab927a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.134724] env[60400]: DEBUG oslo_vmware.api [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Waiting for the task: (returnval){ [ 877.134724] env[60400]: value = "task-449816" [ 877.134724] env[60400]: _type = "Task" [ 877.134724] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 877.143188] env[60400]: DEBUG oslo_vmware.api [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Task: {'id': task-449816, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 877.646046] env[60400]: DEBUG oslo_vmware.exceptions [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 877.646046] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 877.646046] env[60400]: ERROR nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 877.646046] env[60400]: Faults: ['InvalidArgument'] [ 877.646046] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Traceback (most recent call last): [ 877.646046] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 877.646046] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] yield resources [ 877.646046] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 877.646046] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] self.driver.spawn(context, instance, image_meta, [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] self._fetch_image_if_missing(context, vi) [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] image_cache(vi, tmp_image_ds_loc) [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", 
line 537, in _cache_sparse_image [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] vm_util.copy_virtual_disk( [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] session._wait_for_task(vmdk_copy_task) [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] return self.wait_for_task(task_ref) [ 877.646425] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] return evt.wait() [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] result = hub.switch() [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] return self.greenlet.switch() [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] self.f(*self.args, **self.kw) [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] raise exceptions.translate_fault(task_info.error) [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Faults: ['InvalidArgument'] [ 877.646755] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] [ 877.646755] env[60400]: INFO nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Terminating instance [ 877.647729] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 877.647961] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 877.648197] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6f4e8f4e-d38e-4ca7-b783-5ae939855ea3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.650273] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 877.650461] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 877.651186] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd660cd0-2280-43ec-b62e-3a8881966877 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.658848] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 877.659042] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1f5a6aef-f0ee-4b5f-a00d-048ad8b6cc53 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.661100] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 877.661262] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 877.662142] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6c036598-a2f7-4d0a-862a-3eb5ed2139e3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.666483] env[60400]: DEBUG oslo_vmware.api [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Waiting for the task: (returnval){ [ 877.666483] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5254f364-4f40-cce3-30e4-d83faabdbfa5" [ 877.666483] env[60400]: _type = "Task" [ 877.666483] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 877.673379] env[60400]: DEBUG oslo_vmware.api [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5254f364-4f40-cce3-30e4-d83faabdbfa5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 877.730844] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 877.731119] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 877.731246] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Deleting the datastore file [datastore1] a45f24ab-afe1-4ffd-a917-11b68a0b29ec {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 877.731507] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-76149eaa-3335-4c72-97dc-d11a722eb8a6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.737950] env[60400]: DEBUG oslo_vmware.api [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Waiting for the task: (returnval){ [ 877.737950] env[60400]: value = "task-449818" [ 877.737950] env[60400]: _type = "Task" [ 877.737950] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 877.745592] env[60400]: DEBUG oslo_vmware.api [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Task: {'id': task-449818, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 878.176688] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 878.177032] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Creating directory with path [datastore1] vmware_temp/2439d4d9-f445-460d-bb70-25f355f24fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 878.177136] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-520fb481-3438-427c-b23d-95ce7026f5a1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.188276] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Created directory with path [datastore1] vmware_temp/2439d4d9-f445-460d-bb70-25f355f24fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 878.188467] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Fetch image to [datastore1] vmware_temp/2439d4d9-f445-460d-bb70-25f355f24fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 878.188629] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/2439d4d9-f445-460d-bb70-25f355f24fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 878.189690] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-968fa30e-a8b7-4857-af44-dc01078a3d76 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.196026] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf88cc1c-d253-425c-bfb3-037ac9221eb9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.205626] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11ceecac-86ed-415c-b464-5e71bb53ae1b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.235536] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47412b3b-8335-4056-8f23-52ea8cad07d2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} 
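
The repeated "Task: {...} progress is 0%." records above are oslo.vmware's task-polling loop at work: wait_for_task() parks the caller on an event while a looping call re-reads the task's state from vCenter until it reaches success or error (the api.py:397/:434/:444 frames in this log). A minimal self-contained sketch of that pattern, with a toy task object standing in for the real PropertyCollector reads:

    import time

    class FakeTask:
        """Toy stand-in for a vCenter task; state advances on each poll.

        The real driver reads task_info via PropertyCollector.RetrievePropertiesEx.
        """
        def __init__(self, name, polls_until_done=3):
            self.name = name
            self._polls_left = polls_until_done

        def poll(self):
            self._polls_left -= 1
            if self._polls_left > 0:
                return {"state": "running", "progress": 0}
            return {"state": "success", "progress": 100}

    def wait_for_task(task, interval=0.5, timeout=30.0):
        """Poll until the task finishes, mirroring _poll_task's loop."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = task.poll()
            print(f"Task: {{'name': {task.name!r}}} progress is {info['progress']}%.")
            if info["state"] == "success":
                return info
            if info["state"] == "error":
                raise RuntimeError(f"Task {task.name} failed")
            time.sleep(interval)
        raise TimeoutError(f"Task {task.name} did not complete within {timeout}s")

    wait_for_task(FakeTask("CopyVirtualDisk_Task"))
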
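The RemoteDisconnected warning a few records back comes from the upload path's close(): per the traceback, oslo_vmware.rw_handles finishes an HTTP write to the datastore by calling getresponse() (rw_handles.py line 283), and a server that hangs up without replying surfaces as http.client.RemoteDisconnected. A sketch of a close that tolerates the hangup the way that WARNING does (illustrative helper, not the real handle, which also tracks the file size and URL):

    import http.client

    def close_write_handle(conn: http.client.HTTPConnection):
        """Finish an HTTP PUT to the datastore, tolerating an abrupt hangup."""
        try:
            resp = conn.getresponse()  # server's verdict on the upload
            resp.read()                # drain the body
            return resp.status
        except http.client.RemoteDisconnected:
            # "Remote end closed connection without response": log it and
            # carry on, as the WARNING in this log does.
            print("Error occurred while reading the HTTP response.")
            return None
        finally:
            conn.close()
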
[ 878.242953] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1f78eb16-76f3-4350-ae7e-bb0fab697299 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.247316] env[60400]: DEBUG oslo_vmware.api [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Task: {'id': task-449818, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071763} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 878.247825] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 878.248032] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 878.248189] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 878.248357] env[60400]: INFO nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Took 0.60 seconds to destroy the instance on the hypervisor. 
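
The destroy path just logged runs in a fixed order: unregister the VM from the vCenter inventory (VirtualMachine.UnregisterVM), delete the instance's directory from the datastore (FileManager.DeleteDatastoreFile_Task), then report total wall-clock time. Schematically, with illustrative placeholder helpers rather than Nova's real signatures:

    import time

    # Placeholders for the two vCenter calls seen above; the real driver
    # issues them through the VMware API session and waits on each task.
    def unregister_vm(vm_ref):
        print(f"Unregistered the VM {vm_ref}")

    def delete_datastore_dir(path):
        print(f"Deleted the datastore file {path}")

    def destroy_instance(vm_ref, ds_path):
        start = time.monotonic()
        unregister_vm(vm_ref)          # drop the VM from vCenter's inventory
        delete_datastore_dir(ds_path)  # then remove its files from the datastore
        elapsed = time.monotonic() - start
        print(f"Took {elapsed:.2f} seconds to destroy the instance "
              "on the hypervisor.")

    destroy_instance("a45f24ab-afe1-4ffd-a917-11b68a0b29ec",
                     "[datastore1] a45f24ab-afe1-4ffd-a917-11b68a0b29ec")
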
[ 878.250474] env[60400]: DEBUG nova.compute.claims [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 878.250625] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.250842] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.330552] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 878.376344] env[60400]: DEBUG oslo_vmware.rw_handles [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2439d4d9-f445-460d-bb70-25f355f24fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 878.434911] env[60400]: DEBUG oslo_vmware.rw_handles [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 878.434911] env[60400]: DEBUG oslo_vmware.rw_handles [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2439d4d9-f445-460d-bb70-25f355f24fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 878.575499] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59c2a315-aa55-4e7b-988c-add54c9ef102 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.584312] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1232f9fd-7209-4756-98ce-8e0c607800b1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.614437] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4d4dc77-ae29-4f66-86ff-ae93f86913a7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.621463] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2020e25b-67b6-4591-8610-ed8a35f1bf1b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.634473] env[60400]: DEBUG nova.compute.provider_tree [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 878.642286] env[60400]: DEBUG nova.scheduler.client.report [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 878.654935] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.404s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.655464] env[60400]: ERROR nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 878.655464] env[60400]: Faults: ['InvalidArgument'] [ 878.655464] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Traceback (most recent call last): [ 878.655464] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 878.655464] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] 
self.driver.spawn(context, instance, image_meta, [ 878.655464] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 878.655464] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 878.655464] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 878.655464] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] self._fetch_image_if_missing(context, vi) [ 878.655464] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 878.655464] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] image_cache(vi, tmp_image_ds_loc) [ 878.655464] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] vm_util.copy_virtual_disk( [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] session._wait_for_task(vmdk_copy_task) [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] return self.wait_for_task(task_ref) [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] return evt.wait() [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] result = hub.switch() [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] return self.greenlet.switch() [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 878.656009] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] self.f(*self.args, **self.kw) [ 878.656443] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 878.656443] env[60400]: ERROR nova.compute.manager [instance: 
a45f24ab-afe1-4ffd-a917-11b68a0b29ec] raise exceptions.translate_fault(task_info.error) [ 878.656443] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 878.656443] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Faults: ['InvalidArgument'] [ 878.656443] env[60400]: ERROR nova.compute.manager [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] [ 878.656443] env[60400]: DEBUG nova.compute.utils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] VimFaultException {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 878.657512] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Build of instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec was re-scheduled: A specified parameter was not correct: fileType [ 878.657512] env[60400]: Faults: ['InvalidArgument'] {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 878.658435] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 878.658435] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 878.658435] env[60400]: DEBUG nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 878.658435] env[60400]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 878.926933] env[60400]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 878.932758] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 878.932966] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 878.933114] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Cleaning up deleted instances {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11095}} [ 878.936348] env[60400]: INFO nova.compute.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Took 0.28 seconds to deallocate network for instance. 
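
The "Inventory has not changed" records a little earlier report, per resource class, a total, a reserved amount, unit bounds, and an allocation ratio; the capacity placement actually schedules against works out as (total - reserved) * allocation_ratio. For the inventory shown above that is 192 VCPU, 196078 MB of RAM, and 400 GB of disk; a quick check of the arithmetic:

    # Figures copied from the provider inventory logged above.
    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {capacity:.0f}")
    # VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400
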
[ 878.945111] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] There are 0 instances to clean {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11104}} [ 878.945322] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 878.945448] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Cleaning up deleted instances with incomplete migration {{(pid=60400) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11133}} [ 878.954738] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 879.025735] env[60400]: INFO nova.scheduler.client.report [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Deleted allocations for instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec [ 879.041459] env[60400]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 247.193s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 879.042492] env[60400]: DEBUG oslo_concurrency.lockutils [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 47.744s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 879.042732] env[60400]: DEBUG oslo_concurrency.lockutils [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 879.042940] env[60400]: DEBUG oslo_concurrency.lockutils [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 879.043114] env[60400]: DEBUG oslo_concurrency.lockutils [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s 
{{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 879.045680] env[60400]: INFO nova.compute.manager [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Terminating instance [ 879.046967] env[60400]: DEBUG nova.compute.manager [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 879.047077] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 879.048719] env[60400]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c748816c-fbef-4ec0-bd1b-3ca399868e40 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.058377] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b5d4d6e-c75f-4f39-8d9c-8016240c94a3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.071348] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 879.090232] env[60400]: WARNING nova.virt.vmwareapi.vmops [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec could not be found. [ 879.090403] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 879.090576] env[60400]: INFO nova.compute.manager [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Took 0.04 seconds to destroy the instance on the hypervisor. [ 879.090808] env[60400]: DEBUG oslo.service.loopingcall [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 879.091035] env[60400]: DEBUG nova.compute.manager [-] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 879.091129] env[60400]: DEBUG nova.network.neutron [-] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 879.120067] env[60400]: DEBUG nova.network.neutron [-] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 879.121629] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 879.121846] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 879.123314] env[60400]: INFO nova.compute.claims [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 879.128578] env[60400]: INFO nova.compute.manager [-] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Took 0.04 seconds to deallocate network for instance. 
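
Every lockutils record in this log pairs an acquire with a release and carries two timings: how long the caller waited to get the lock and how long it was then held (above, the build lock was held 247.193s while the terminate request waited 47.744s for it). A simplified sketch of the wrapper that produces those lines (oslo.concurrency's inner() adds the same accounting around its named semaphores; external and fair locking are omitted here):

    import threading
    import time
    from contextlib import contextmanager

    _locks = {}  # one lock per name, like lockutils' semaphore registry

    @contextmanager
    def timed_lock(name, caller):
        lock = _locks.setdefault(name, threading.Lock())
        print(f'Acquiring lock "{name}" by "{caller}"')
        t0 = time.monotonic()
        with lock:
            print(f'Lock "{name}" acquired by "{caller}" :: '
                  f'waited {time.monotonic() - t0:.3f}s')
            t1 = time.monotonic()
            try:
                yield
            finally:
                print(f'Lock "{name}" "released" by "{caller}" :: '
                      f'held {time.monotonic() - t1:.3f}s')

    with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
        time.sleep(0.1)  # stand-in for the claimed critical section
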
[ 879.237989] env[60400]: DEBUG oslo_concurrency.lockutils [None req-cc6d3dfe-f8ea-452b-89bf-32e56cdd5ca7 tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "a45f24ab-afe1-4ffd-a917-11b68a0b29ec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.195s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 879.388967] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-193ed260-19be-409c-9294-b3a43098c4b0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.396333] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6646ce11-89b1-46d6-b096-280bacb458d3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.426517] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4b9a889-002f-4e5e-8e9d-22498267b140 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.433794] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6067abfa-fc74-4159-97ee-885ae18b6568 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.446716] env[60400]: DEBUG nova.compute.provider_tree [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 879.455627] env[60400]: DEBUG nova.scheduler.client.report [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 879.468462] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.347s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 879.468906] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Start building networks asynchronously for instance. 
{{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 879.503013] env[60400]: DEBUG nova.compute.utils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 879.504673] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 879.504905] env[60400]: DEBUG nova.network.neutron [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 879.512648] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 879.572108] env[60400]: DEBUG nova.policy [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '19d32e6d286a48099a4a6b39cc21e5c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6cd208d1e842468ea334e506728ad9d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 879.576245] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Start spawning the instance on the hypervisor. 
{{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 879.600452] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 879.600809] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 879.601170] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 879.601394] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 879.601598] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 879.601838] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 879.602058] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 879.602211] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 879.602372] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 
tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 879.602635] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 879.603007] env[60400]: DEBUG nova.virt.hardware [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 879.603868] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26e1a526-82ac-4a19-ae84-d10acfd86b33 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.612824] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-502e5a2d-50b7-4c52-ab73-72f01836363c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 880.061284] env[60400]: DEBUG nova.network.neutron [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Successfully created port: 2ec34df8-b677-4aeb-bf94-343f23356cb6 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 880.968807] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 881.212022] env[60400]: DEBUG nova.network.neutron [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Successfully updated port: 2ec34df8-b677-4aeb-bf94-343f23356cb6 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 881.223764] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "refresh_cache-837197c0-9ff8-45a2-8bf0-730158a43a17" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 881.223764] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquired lock "refresh_cache-837197c0-9ff8-45a2-8bf0-730158a43a17" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 881.223764] env[60400]: DEBUG nova.network.neutron [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Building network info cache for instance {{(pid=60400) _get_instance_nw_info 
[ 881.284021] env[60400]: DEBUG nova.network.neutron [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 881.286826] env[60400]: DEBUG nova.compute.manager [req-1297dad9-1a0c-4cce-8a08-262c2acfa579 req-78792b7e-2d1d-466c-b70a-1a19438a1a83 service nova] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Received event network-vif-plugged-2ec34df8-b677-4aeb-bf94-343f23356cb6 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 881.287340] env[60400]: DEBUG oslo_concurrency.lockutils [req-1297dad9-1a0c-4cce-8a08-262c2acfa579 req-78792b7e-2d1d-466c-b70a-1a19438a1a83 service nova] Acquiring lock "837197c0-9ff8-45a2-8bf0-730158a43a17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 881.287705] env[60400]: DEBUG oslo_concurrency.lockutils [req-1297dad9-1a0c-4cce-8a08-262c2acfa579 req-78792b7e-2d1d-466c-b70a-1a19438a1a83 service nova] Lock "837197c0-9ff8-45a2-8bf0-730158a43a17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 881.287782] env[60400]: DEBUG oslo_concurrency.lockutils [req-1297dad9-1a0c-4cce-8a08-262c2acfa579 req-78792b7e-2d1d-466c-b70a-1a19438a1a83 service nova] Lock "837197c0-9ff8-45a2-8bf0-730158a43a17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 881.287903] env[60400]: DEBUG nova.compute.manager [req-1297dad9-1a0c-4cce-8a08-262c2acfa579 req-78792b7e-2d1d-466c-b70a-1a19438a1a83 service nova] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] No waiting events found dispatching network-vif-plugged-2ec34df8-b677-4aeb-bf94-343f23356cb6 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 881.288066] env[60400]: WARNING nova.compute.manager [req-1297dad9-1a0c-4cce-8a08-262c2acfa579 req-78792b7e-2d1d-466c-b70a-1a19438a1a83 service nova] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Received unexpected event network-vif-plugged-2ec34df8-b677-4aeb-bf94-343f23356cb6 for instance with vm_state building and task_state spawning.
[ 881.566738] env[60400]: DEBUG nova.network.neutron [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Updating instance_info_cache with network_info: [{"id": "2ec34df8-b677-4aeb-bf94-343f23356cb6", "address": "fa:16:3e:e8:7a:9c", "network": {"id": "00a435a6-4ca4-4cbc-b89f-5d3230344438", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-336946882-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "6cd208d1e842468ea334e506728ad9d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13e83154-c0d2-4d3d-b95e-3cd5ba336257", "external-id": "nsx-vlan-transportzone-771", "segmentation_id": 771, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ec34df8-b6", "ovs_interfaceid": "2ec34df8-b677-4aeb-bf94-343f23356cb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 881.580146] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Releasing lock "refresh_cache-837197c0-9ff8-45a2-8bf0-730158a43a17" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 881.580442] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Instance network_info: |[{"id": "2ec34df8-b677-4aeb-bf94-343f23356cb6", "address": "fa:16:3e:e8:7a:9c", "network": {"id": "00a435a6-4ca4-4cbc-b89f-5d3230344438", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-336946882-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "6cd208d1e842468ea334e506728ad9d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13e83154-c0d2-4d3d-b95e-3cd5ba336257", "external-id": "nsx-vlan-transportzone-771", "segmentation_id": 771, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ec34df8-b6", "ovs_interfaceid": "2ec34df8-b677-4aeb-bf94-343f23356cb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 881.580871] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e8:7a:9c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '13e83154-c0d2-4d3d-b95e-3cd5ba336257', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2ec34df8-b677-4aeb-bf94-343f23356cb6', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 881.588538] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Creating folder: Project (6cd208d1e842468ea334e506728ad9d4). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 881.589054] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d6d20db1-b881-4fd7-a114-ab78c0b99a13 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 881.601095] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Created folder: Project (6cd208d1e842468ea334e506728ad9d4) in parent group-v119075.
[ 881.601316] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Creating folder: Instances. Parent ref: group-v119122. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 881.601536] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ea5dd7b3-4640-44ad-a2ad-b53ec1975c09 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 881.610122] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Created folder: Instances in parent group-v119122.
[ 881.610342] env[60400]: DEBUG oslo.service.loopingcall [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 881.610509] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 881.610720] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-88efaded-d422-4623-aa56-8fa7a088af27 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 881.629598] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 881.629598] env[60400]: value = "task-449821"
[ 881.629598] env[60400]: _type = "Task"
[ 881.629598] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 881.638390] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449821, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 881.934635] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 882.138883] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449821, 'name': CreateVM_Task, 'duration_secs': 0.289515} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 882.139168] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 882.139752] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 882.139849] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 882.140172] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 882.140395] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-abf22151-db7a-4c24-bca6-0ce235935af3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 882.144501] env[60400]: DEBUG oslo_vmware.api [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Waiting for the task: (returnval){
[ 882.144501] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52be7e92-a05d-4a15-731d-dce1b7d1f00a"
[ 882.144501] env[60400]: _type = "Task"
[ 882.144501] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 882.151493] env[60400]: DEBUG oslo_vmware.api [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52be7e92-a05d-4a15-731d-dce1b7d1f00a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 882.655034] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 882.655303] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 882.655507] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 882.928381] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 882.933035] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 882.933207] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}}
[ 882.933328] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 882.956540] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 882.956823] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 882.957089] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 882.957332] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 882.957573] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 882.957795] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 882.958023] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 882.958253] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 882.958488] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 882.958714] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 882.958936] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}}
[ 882.959625] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 883.725080] env[60400]: DEBUG nova.compute.manager [req-41dbc8f3-ce0a-4015-8be8-222611e2cf87 req-e3bf2165-8f7a-40d5-b203-b554ededeff9 service nova] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Received event network-changed-2ec34df8-b677-4aeb-bf94-343f23356cb6 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 883.725397] env[60400]: DEBUG nova.compute.manager [req-41dbc8f3-ce0a-4015-8be8-222611e2cf87 req-e3bf2165-8f7a-40d5-b203-b554ededeff9 service nova] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Refreshing instance network info cache due to event network-changed-2ec34df8-b677-4aeb-bf94-343f23356cb6. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}}
[ 883.725466] env[60400]: DEBUG oslo_concurrency.lockutils [req-41dbc8f3-ce0a-4015-8be8-222611e2cf87 req-e3bf2165-8f7a-40d5-b203-b554ededeff9 service nova] Acquiring lock "refresh_cache-837197c0-9ff8-45a2-8bf0-730158a43a17" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 883.725597] env[60400]: DEBUG oslo_concurrency.lockutils [req-41dbc8f3-ce0a-4015-8be8-222611e2cf87 req-e3bf2165-8f7a-40d5-b203-b554ededeff9 service nova] Acquired lock "refresh_cache-837197c0-9ff8-45a2-8bf0-730158a43a17" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 883.726321] env[60400]: DEBUG nova.network.neutron [req-41dbc8f3-ce0a-4015-8be8-222611e2cf87 req-e3bf2165-8f7a-40d5-b203-b554ededeff9 service nova] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Refreshing network info cache for port 2ec34df8-b677-4aeb-bf94-343f23356cb6 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}}
[ 883.932560] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 883.932811] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 883.942157] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 883.942427] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 883.942596] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 883.942747] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 883.943846] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa406ebe-23d9-4e26-afe6-bb026610f9d9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.952966] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55c8c178-cf1d-4eba-8d24-317ee6bdbafc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.967475] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dc31233-66b5-4047-99f1-669743ee9a79 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.973986] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7ed573e-106f-499d-9c3d-c3708c3ce6d2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 884.004932] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181778MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 884.005101] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 884.005293] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 884.138621] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 884.138763] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance cc1d534d-6a43-4575-895d-c3bef84d772e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 884.138858] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance e4f0342a-4169-40aa-b234-a2e2340d5b05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 884.138926] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance f202a181-b5ea-4b06-91ad-86356b51e088 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 884.139121] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 63151ec9-f383-46cc-ac57-c3f7f1569410 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 884.139234] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance c5b391a9-7969-4119-9bc6-b0e1fe7a9713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 884.139300] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 884.139426] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 7476fb96-5247-472c-ab92-ef7e5916cb00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 884.139544] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 35630c7b-fdf4-4d6d-8e5a-0045f1387f93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 884.139677] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 837197c0-9ff8-45a2-8bf0-730158a43a17 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 884.155206] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 56471a78-08cd-4d1a-b3f5-d1eac277183e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 884.169988] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 1240824e-c5f1-4517-b182-20245311c687 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 884.186895] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 884.201605] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 884.217137] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 49aaf98b-945e-4c5d-8158-641b8650a8a7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 884.227511] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance cb7a8413-4414-4de6-8d4f-9ac4f1784f35 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 884.239616] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 01b62d6f-6718-45b4-8f67-cdb77c5f4bd0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 884.249989] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 19881c50-a8ff-411f-b570-d4dc9ef3b0dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}}
[ 884.250236] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 884.250377] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 884.268323] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Refreshing inventories for resource provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 884.283939] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Updating ProviderTree inventory for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 884.284130] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Updating inventory in ProviderTree for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 884.302021] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Refreshing aggregate associations for resource provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc, aggregates: None {{(pid=60400) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 884.325235] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Refreshing trait associations for resource provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=60400) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 884.476692] env[60400]: DEBUG nova.network.neutron [req-41dbc8f3-ce0a-4015-8be8-222611e2cf87 req-e3bf2165-8f7a-40d5-b203-b554ededeff9 service nova] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Updated VIF entry in instance network info cache for port 2ec34df8-b677-4aeb-bf94-343f23356cb6. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}}
[ 884.477062] env[60400]: DEBUG nova.network.neutron [req-41dbc8f3-ce0a-4015-8be8-222611e2cf87 req-e3bf2165-8f7a-40d5-b203-b554ededeff9 service nova] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Updating instance_info_cache with network_info: [{"id": "2ec34df8-b677-4aeb-bf94-343f23356cb6", "address": "fa:16:3e:e8:7a:9c", "network": {"id": "00a435a6-4ca4-4cbc-b89f-5d3230344438", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-336946882-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "6cd208d1e842468ea334e506728ad9d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13e83154-c0d2-4d3d-b95e-3cd5ba336257", "external-id": "nsx-vlan-transportzone-771", "segmentation_id": 771, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ec34df8-b6", "ovs_interfaceid": "2ec34df8-b677-4aeb-bf94-343f23356cb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 884.489943] env[60400]: DEBUG oslo_concurrency.lockutils [req-41dbc8f3-ce0a-4015-8be8-222611e2cf87 req-e3bf2165-8f7a-40d5-b203-b554ededeff9 service nova] Releasing lock "refresh_cache-837197c0-9ff8-45a2-8bf0-730158a43a17" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 884.567168] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1651e53-dcb3-44eb-b80b-0b4785eeca0c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 884.575259] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7688e58-f85b-4914-9fab-ddb6f44d716f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 884.613569] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11fed584-50a2-4a77-9e50-03edcd9ab242 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 884.621148] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a7adb73-6386-4d16-972c-3e4bae9ce2a6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 884.635321] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 884.643941] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 884.656731] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 884.656996] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.652s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 885.657592] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 885.657866] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
[ 886.918175] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 886.918475] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 907.574160] env[60400]: DEBUG nova.compute.manager [req-e67b9617-cdd6-4153-b3b4-5faf74efa8ae req-a61ee12b-610c-4f14-99c5-368f3991758e service nova] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Received event network-vif-deleted-fb944cf8-2052-40f8-ac94-fa2beac376d5 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 909.981402] env[60400]: DEBUG nova.compute.manager [req-5eca9e06-5967-4dd4-87ed-f2482c796935 req-3b8e4ac3-2415-4b99-a3f5-9fa4b79d9d4a service nova] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Received event network-vif-deleted-7f9ba268-7959-4d15-8d3d-f7a6c33a0287 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 914.644842] env[60400]: DEBUG nova.compute.manager [req-cbe15468-0050-4a68-999d-ae154521289a req-0d0948ab-e5f3-4969-8ecf-5ffbae3b6a12 service nova] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Received event network-vif-deleted-777d624e-2007-42dc-b553-d6efc26d590f {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 917.008181] env[60400]: DEBUG nova.compute.manager [req-4fc24e0c-de9d-46da-8715-9d1c6348c95b req-e2537c1c-7adb-472c-b2cc-a5368e286796 service nova] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Received event network-vif-deleted-cbe23c5f-783c-4a2c-9f5e-e7305fdcbea9 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 917.008553] env[60400]: DEBUG nova.compute.manager [req-4fc24e0c-de9d-46da-8715-9d1c6348c95b req-e2537c1c-7adb-472c-b2cc-a5368e286796 service nova] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Received event network-vif-deleted-6cf68b2a-cbef-4cdf-9893-d28ee3add61e {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 919.040726] env[60400]: DEBUG nova.compute.manager [req-b3033dea-1fb8-4f2a-abd7-a5f86f7047f1 req-42429f69-0f50-483b-8a2b-475152940930 service nova] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Received event network-vif-deleted-2ec34df8-b677-4aeb-bf94-343f23356cb6 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 919.041522] env[60400]: DEBUG nova.compute.manager [req-b3033dea-1fb8-4f2a-abd7-a5f86f7047f1 req-42429f69-0f50-483b-8a2b-475152940930 service nova] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Received event network-vif-deleted-25bf61e2-4397-46ff-abad-121b47570779 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 925.335112] env[60400]: WARNING oslo_vmware.rw_handles [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles response.begin()
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 925.335112] env[60400]: ERROR oslo_vmware.rw_handles
[ 925.336324] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/2439d4d9-f445-460d-bb70-25f355f24fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 925.338467] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 925.338816] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Copying Virtual Disk [datastore1] vmware_temp/2439d4d9-f445-460d-bb70-25f355f24fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/2439d4d9-f445-460d-bb70-25f355f24fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 925.339405] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-eb77f55e-dfdf-45e8-b801-f675c71d0f65 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 925.348767] env[60400]: DEBUG oslo_vmware.api [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Waiting for the task: (returnval){
[ 925.348767] env[60400]: value = "task-449822"
[ 925.348767] env[60400]: _type = "Task"
[ 925.348767] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 925.357977] env[60400]: DEBUG oslo_vmware.api [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Task: {'id': task-449822, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 925.867941] env[60400]: DEBUG oslo_vmware.exceptions [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 925.868228] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 925.868794] env[60400]: ERROR nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 925.868794] env[60400]: Faults: ['InvalidArgument']
[ 925.868794] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Traceback (most recent call last):
[ 925.868794] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 925.868794] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] yield resources
[ 925.868794] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 925.868794] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] self.driver.spawn(context, instance, image_meta,
[ 925.868794] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 925.868794] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 925.868794] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 925.868794] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] self._fetch_image_if_missing(context, vi)
[ 925.868794] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] image_cache(vi, tmp_image_ds_loc)
[ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] vm_util.copy_virtual_disk(
[ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] session._wait_for_task(vmdk_copy_task)
[ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] return self.wait_for_task(task_ref)
env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] return self.wait_for_task(task_ref) [ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] return evt.wait() [ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] result = hub.switch() [ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 925.869174] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] return self.greenlet.switch() [ 925.869902] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 925.869902] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] self.f(*self.args, **self.kw) [ 925.869902] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 925.869902] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] raise exceptions.translate_fault(task_info.error) [ 925.869902] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 925.869902] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Faults: ['InvalidArgument'] [ 925.869902] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] [ 925.869902] env[60400]: INFO nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Terminating instance [ 925.871265] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 925.871398] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 925.872015] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Start destroying the instance on the hypervisor. 
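The "Fault InvalidArgument not matched" record followed by `raise exceptions.translate_fault(task_info.error)` in the traceback above is oslo.vmware's fault-translation step: the task error carries a list of fault names, a registry maps known names to specific exception classes, and unmatched names fall back to the generic VimFaultException. A minimal sketch of that pattern follows; the registry and the FileFaultException class here are illustrative stand-ins, not oslo.vmware's actual internals.

```python
# Sketch of the fault-translation pattern seen in the log above.
class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

class FileFaultException(VimFaultException):
    """Example of a specific class a fault registry might map to."""

# Hypothetical registry: fault name -> exception class.
_FAULT_CLASSES = {"FileFault": FileFaultException}

def translate_fault(fault_name, message):
    # "Fault InvalidArgument not matched." corresponds to this fallback
    # branch: no specific class is registered for the fault name.
    cls = _FAULT_CLASSES.get(fault_name, VimFaultException)
    return cls([fault_name], message)

err = translate_fault("InvalidArgument",
                      "A specified parameter was not correct: fileType")
print(type(err).__name__, err.fault_list)  # VimFaultException ['InvalidArgument']
```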
[ 925.872238] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 925.872454] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-98d7bd69-3c6a-49d4-bde4-ee60eaa144f7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 925.875256] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5a555d7-1d93-4cbd-9875-8a0ae529a729 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 925.885627] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 925.887149] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7b9201df-21c8-49e1-8971-863ada5fc14a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 925.891167] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 925.893526] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 925.893526] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d8de8825-f201-4c89-a08e-f3d87a23a046 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 925.897783] env[60400]: DEBUG oslo_vmware.api [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Waiting for the task: (returnval){
[ 925.897783] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52e6fa47-34ab-e3b3-5376-6b8961b5f82a"
[ 925.897783] env[60400]: _type = "Task"
[ 925.897783] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 925.906613] env[60400]: DEBUG oslo_vmware.api [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52e6fa47-34ab-e3b3-5376-6b8961b5f82a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 926.416387] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 926.416668] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Creating directory with path [datastore1] vmware_temp/7f1d2707-7738-4d3b-8958-63126ee0c241/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 926.416932] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b996c667-7a4e-42be-a2b5-b4e626459a41 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.442502] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Created directory with path [datastore1] vmware_temp/7f1d2707-7738-4d3b-8958-63126ee0c241/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 926.443027] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Fetch image to [datastore1] vmware_temp/7f1d2707-7738-4d3b-8958-63126ee0c241/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 926.443027] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/7f1d2707-7738-4d3b-8958-63126ee0c241/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 926.443641] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1348d09b-6d6c-44e4-82ea-28f9043156c3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.457123] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8751441b-f434-49b4-89f1-367802cacf20 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.466157] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca0ad166-0f28-4ca0-aedd-4e2341b4b632 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.504457] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d64c9c72-a39e-4c19-8418-db058b55e1a3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.516207] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-487b1424-225b-4f8f-98de-e31fd597ba16 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.538582] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 926.639153] env[60400]: DEBUG oslo_vmware.rw_handles [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7f1d2707-7738-4d3b-8958-63126ee0c241/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 926.702855] env[60400]: DEBUG oslo_vmware.rw_handles [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 926.702988] env[60400]: DEBUG oslo_vmware.rw_handles [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7f1d2707-7738-4d3b-8958-63126ee0c241/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
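The rw_handles records above show the image bytes being streamed straight to the ESX host's /folder endpoint, with the datacenter path and datastore name passed as query parameters. A rough sketch of what that write connection amounts to, using plain `requests`; the ticket cookie name and the exact header set are assumptions here, and oslo.vmware's FileWriteHandle additionally manages chunked transfer and error handling itself:

```python
# Streaming upload to a datastore /folder URL, as in the record above.
import requests

def upload_to_datastore(host, ds_path, dc_path, ds_name, data, ticket, size):
    # ds_path is the datastore-relative path, e.g. "vmware_temp/.../tmp-sparse.vmdk"
    url = f"https://{host}:443/folder/{ds_path}"
    resp = requests.put(
        url,
        params={"dcPath": dc_path, "dsName": ds_name},
        headers={"Content-Length": str(size)},
        cookies={"vmware_cgi_ticket": ticket},  # assumption: generic service ticket cookie
        data=data,      # any iterable of bytes streams the body
        verify=False,   # lab setup; production should verify certificates
    )
    resp.raise_for_status()
```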
[ 927.287997] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 927.288241] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 927.291075] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Deleting the datastore file [datastore1] cc1d534d-6a43-4575-895d-c3bef84d772e {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 927.291075] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fbc93ab9-cdc2-41ff-beff-8b5a1a524d59 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 927.298662] env[60400]: DEBUG oslo_vmware.api [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Waiting for the task: (returnval){
[ 927.298662] env[60400]: value = "task-449824"
[ 927.298662] env[60400]: _type = "Task"
[ 927.298662] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 927.308917] env[60400]: DEBUG oslo_vmware.api [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Task: {'id': task-449824, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 927.809314] env[60400]: DEBUG oslo_vmware.api [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Task: {'id': task-449824, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071393} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 927.809314] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 927.809314] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 927.809728] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 927.809728] env[60400]: INFO nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Took 1.94 seconds to destroy the instance on the hypervisor.
[ 927.812963] env[60400]: DEBUG nova.compute.claims [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 927.813122] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 927.813327] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 928.007857] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f35c160d-8e2d-4f71-922d-65dd3fbaac83 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 928.017236] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bd345c7-2bea-4850-adfc-f00a7436ed9e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 928.051867] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b083e7d-33dd-4f65-b1c8-317fff0a2789 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 928.060315] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0132f74a-05f9-4002-81c5-8ef2fd6588a5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
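The lockutils records that bracket the claim abort above follow a fixed pattern: "Acquiring lock X by Y", then "acquired :: waited Ns", then "released :: held Ns". A minimal sketch of that bookkeeping with a plain threading.Lock; oslo.concurrency's lockutils.lock() emits equivalent DEBUG lines around a fasteners-based lock, so the helper below is an illustration, not its implementation:

```python
# Sketch of the "waited X / held Y" lock accounting seen in the log.
import threading
import time
from contextlib import contextmanager

_locks = {}

@contextmanager
def timed_lock(name, by):
    lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    print(f'Lock "{name}" acquired by "{by}" :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - t1
        print(f'Lock "{name}" "released" by "{by}" :: held {held:.3f}s')

with timed_lock("compute_resources", "abort_instance_claim"):
    pass  # claim/abort work happens while the lock is held
```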
[ 928.078307] env[60400]: DEBUG nova.compute.provider_tree [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 928.090608] env[60400]: DEBUG nova.scheduler.client.report [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 928.110340] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.297s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 928.111072] env[60400]: ERROR nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 928.111072] env[60400]: Faults: ['InvalidArgument']
[ 928.111072] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Traceback (most recent call last):
[ 928.111072] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 928.111072] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     self.driver.spawn(context, instance, image_meta,
[ 928.111072] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 928.111072] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 928.111072] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 928.111072] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     self._fetch_image_if_missing(context, vi)
[ 928.111072] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 928.111072] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     image_cache(vi, tmp_image_ds_loc)
[ 928.111072] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     vm_util.copy_virtual_disk(
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     session._wait_for_task(vmdk_copy_task)
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     return self.wait_for_task(task_ref)
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     return evt.wait()
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     result = hub.switch()
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     return self.greenlet.switch()
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 928.111726] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     self.f(*self.args, **self.kw)
[ 928.112374] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 928.112374] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]     raise exceptions.translate_fault(task_info.error)
[ 928.112374] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 928.112374] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Faults: ['InvalidArgument']
[ 928.112374] env[60400]: ERROR nova.compute.manager [instance: cc1d534d-6a43-4575-895d-c3bef84d772e]
[ 928.112374] env[60400]: DEBUG nova.compute.utils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] VimFaultException {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 928.113147] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Build of instance cc1d534d-6a43-4575-895d-c3bef84d772e was re-scheduled: A specified parameter was not correct: fileType
[ 928.113147] env[60400]: Faults: ['InvalidArgument'] {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 928.113514] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 928.114440] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 928.114440] env[60400]: DEBUG nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 928.114440] env[60400]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 928.480319] env[60400]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 928.493959] env[60400]: INFO nova.compute.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Took 0.38 seconds to deallocate network for instance.
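The inventory record above fixes each resource class's schedulable capacity: Placement computes it as (total - reserved) * allocation_ratio. A quick check of the numbers reported for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc, copied from the log (only the fields used here are kept):

```python
# Effective capacity per resource class, from the inventory record above.
inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(rc, capacity)
# VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
```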
[ 928.614870] env[60400]: INFO nova.scheduler.client.report [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Deleted allocations for instance cc1d534d-6a43-4575-895d-c3bef84d772e
[ 928.630943] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "cc1d534d-6a43-4575-895d-c3bef84d772e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 294.219s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 928.632062] env[60400]: DEBUG oslo_concurrency.lockutils [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "cc1d534d-6a43-4575-895d-c3bef84d772e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 95.420s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 928.632279] env[60400]: DEBUG oslo_concurrency.lockutils [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "cc1d534d-6a43-4575-895d-c3bef84d772e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 928.632477] env[60400]: DEBUG oslo_concurrency.lockutils [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "cc1d534d-6a43-4575-895d-c3bef84d772e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 928.632635] env[60400]: DEBUG oslo_concurrency.lockutils [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "cc1d534d-6a43-4575-895d-c3bef84d772e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 928.634601] env[60400]: INFO nova.compute.manager [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Terminating instance
[ 928.636543] env[60400]: DEBUG nova.compute.manager [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 928.636726] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 928.637255] env[60400]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3fd95e1a-9fa1-4b10-b33d-1c026c4079e3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 928.648284] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7de230f2-3548-4060-a5d7-616bfa95c504 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 928.659428] env[60400]: DEBUG nova.compute.manager [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 56471a78-08cd-4d1a-b3f5-d1eac277183e] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 928.678605] env[60400]: WARNING nova.virt.vmwareapi.vmops [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance cc1d534d-6a43-4575-895d-c3bef84d772e could not be found.
[ 928.678605] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 928.678837] env[60400]: INFO nova.compute.manager [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 928.679087] env[60400]: DEBUG oslo.service.loopingcall [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 928.679297] env[60400]: DEBUG nova.compute.manager [-] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 928.679390] env[60400]: DEBUG nova.network.neutron [-] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 928.684482] env[60400]: DEBUG nova.compute.manager [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 56471a78-08cd-4d1a-b3f5-d1eac277183e] Instance disappeared before build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 928.702167] env[60400]: DEBUG nova.network.neutron [-] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 928.706413] env[60400]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "56471a78-08cd-4d1a-b3f5-d1eac277183e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.496s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 928.710548] env[60400]: INFO nova.compute.manager [-] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Took 0.03 seconds to deallocate network for instance.
[ 928.723167] env[60400]: DEBUG nova.compute.manager [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] [instance: 1240824e-c5f1-4517-b182-20245311c687] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 928.752622] env[60400]: DEBUG nova.compute.manager [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] [instance: 1240824e-c5f1-4517-b182-20245311c687] Instance disappeared before build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 928.784263] env[60400]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "1240824e-c5f1-4517-b182-20245311c687" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.110s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 928.793123] env[60400]: DEBUG nova.compute.manager [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] [instance: daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 928.830979] env[60400]: DEBUG oslo_concurrency.lockutils [None req-535b3fef-8a08-4a03-b175-5102731937ce tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "cc1d534d-6a43-4575-895d-c3bef84d772e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 928.835053] env[60400]: DEBUG nova.compute.manager [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] [instance: daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84] Instance disappeared before build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 928.862344] env[60400]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.498s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 928.874205] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 928.928221] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 928.928474] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 928.930009] env[60400]: INFO nova.compute.claims [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 929.085271] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acdcbff7-63d4-422c-bdd4-361cbb28eacf {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 929.094203] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cdf8c85-379d-4100-8760-309740b4967e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 929.124943] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-997b7a1d-994d-4fef-91c0-c37a32cc1cba {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 929.133187] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db06c7e4-1ef3-4581-8a00-6d500d79534c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 929.147774] env[60400]: DEBUG nova.compute.provider_tree [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 929.159693] env[60400]: DEBUG nova.scheduler.client.report [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 929.182687] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 929.182687] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 929.217069] env[60400]: DEBUG nova.compute.utils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 929.218516] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 929.221053] env[60400]: DEBUG nova.network.neutron [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 929.237157] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 929.308476] env[60400]: DEBUG nova.policy [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '584528a6112f44d0b8c78739e0573670', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b3917aae625b4cfd9a0ab45ad226a4cb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}}
[ 929.326359] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 929.350116] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=<?>,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T04:32:18Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}}
[ 929.350116] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}}
[ 929.350116] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}}
[ 929.350282] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}}
[ 929.350282] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}}
[ 929.350282] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}}
[ 929.350282] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}}
[ 929.350282] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}}
[ 929.350446] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}}
[ 929.350767] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}}
[ 929.351079] env[60400]: DEBUG nova.virt.hardware [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}}
[ 929.352239] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b02b597a-d5ed-4096-acb4-58e95f8df172 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 929.360784] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54b60d68-d21f-4c40-9972-7371d66a965f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 930.205550] env[60400]: DEBUG nova.network.neutron [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Successfully created port: 8ffabf8c-a600-4aac-882d-f90ba4b11a79 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 931.783079] env[60400]: DEBUG nova.compute.manager [req-4fda7b22-afc8-4746-b399-87ab63f6ceb0 req-19333aa9-c6b9-40dd-9530-3233d0b21876 service nova] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Received event network-vif-plugged-8ffabf8c-a600-4aac-882d-f90ba4b11a79 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
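The topology records above walk the selection: no flavor or image limits (0 means "unset", so the 65536 default cap applies), then an enumeration of sockets:cores:threads combinations that fit the vCPU count, of which 1:1:1 is the only possibility for one vCPU. A simplified sketch of that enumeration; nova's actual `_get_possible_cpu_topologies` applies additional preference and ordering rules not reproduced here:

```python
# Enumerate CPU topologies whose sockets*cores*threads equals the vCPU count.
MAX = 65536  # default cap when no flavor/image limit is set

def possible_topologies(vcpus, max_sockets=MAX, max_cores=MAX, max_threads=MAX):
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topos.append((s, c, t))
    return topos

print(possible_topologies(1))  # [(1, 1, 1)], matching the log
```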
[ 931.783079] env[60400]: DEBUG oslo_concurrency.lockutils [req-4fda7b22-afc8-4746-b399-87ab63f6ceb0 req-19333aa9-c6b9-40dd-9530-3233d0b21876 service nova] Acquiring lock "b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 931.783079] env[60400]: DEBUG oslo_concurrency.lockutils [req-4fda7b22-afc8-4746-b399-87ab63f6ceb0 req-19333aa9-c6b9-40dd-9530-3233d0b21876 service nova] Lock "b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 931.784324] env[60400]: DEBUG oslo_concurrency.lockutils [req-4fda7b22-afc8-4746-b399-87ab63f6ceb0 req-19333aa9-c6b9-40dd-9530-3233d0b21876 service nova] Lock "b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 931.784656] env[60400]: DEBUG nova.compute.manager [req-4fda7b22-afc8-4746-b399-87ab63f6ceb0 req-19333aa9-c6b9-40dd-9530-3233d0b21876 service nova] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] No waiting events found dispatching network-vif-plugged-8ffabf8c-a600-4aac-882d-f90ba4b11a79 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 931.785438] env[60400]: WARNING nova.compute.manager [req-4fda7b22-afc8-4746-b399-87ab63f6ceb0 req-19333aa9-c6b9-40dd-9530-3233d0b21876 service nova] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Received unexpected event network-vif-plugged-8ffabf8c-a600-4aac-882d-f90ba4b11a79 for instance with vm_state building and task_state spawning.
[ 931.816362] env[60400]: DEBUG nova.network.neutron [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Successfully updated port: 8ffabf8c-a600-4aac-882d-f90ba4b11a79 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 931.829562] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "refresh_cache-b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 931.829562] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquired lock "refresh_cache-b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 931.829562] env[60400]: DEBUG nova.network.neutron [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}}
[ 932.096381] env[60400]: DEBUG nova.network.neutron [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 932.487798] env[60400]: DEBUG nova.network.neutron [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Updating instance_info_cache with network_info: [{"id": "8ffabf8c-a600-4aac-882d-f90ba4b11a79", "address": "fa:16:3e:67:20:10", "network": {"id": "a8e0284d-da67-4aa5-b342-df06f125b35f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1146102785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b3917aae625b4cfd9a0ab45ad226a4cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db1f7867-8524-469c-ab47-d2c9e2751d98", "external-id": "nsx-vlan-transportzone-130", "segmentation_id": 130, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8ffabf8c-a6", "ovs_interfaceid": "8ffabf8c-a600-4aac-882d-f90ba4b11a79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 932.503103] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Releasing lock "refresh_cache-b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 932.503411] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Instance network_info: |[{"id": "8ffabf8c-a600-4aac-882d-f90ba4b11a79", "address": "fa:16:3e:67:20:10", "network": {"id": "a8e0284d-da67-4aa5-b342-df06f125b35f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1146102785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b3917aae625b4cfd9a0ab45ad226a4cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db1f7867-8524-469c-ab47-d2c9e2751d98", "external-id": "nsx-vlan-transportzone-130", "segmentation_id": 130, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8ffabf8c-a6", "ovs_interfaceid": "8ffabf8c-a600-4aac-882d-f90ba4b11a79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
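The network_info blob cached above is a list of VIF dicts, each with a nested network/subnets/ips structure. A small helper that pulls the fixed addresses out of that structure; the sample data is copied from the record above, trimmed to the fields this helper actually reads:

```python
# Extract fixed IPs from a Nova network_info structure.
network_info = [{
    "id": "8ffabf8c-a600-4aac-882d-f90ba4b11a79",
    "address": "fa:16:3e:67:20:10",
    "network": {"subnets": [{"cidr": "192.168.128.0/28",
                             "ips": [{"address": "192.168.128.11",
                                      "type": "fixed"}]}]},
}]

def fixed_ips(network_info):
    return [ip["address"]
            for vif in network_info
            for subnet in vif["network"]["subnets"]
            for ip in subnet["ips"]
            if ip["type"] == "fixed"]

print(fixed_ips(network_info))  # ['192.168.128.11']
```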
/opt/stack/nova/nova/compute/manager.py:1967}} [ 932.503769] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:67:20:10', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'db1f7867-8524-469c-ab47-d2c9e2751d98', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8ffabf8c-a600-4aac-882d-f90ba4b11a79', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 932.515788] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Creating folder: Project (b3917aae625b4cfd9a0ab45ad226a4cb). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 932.516442] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-49a67618-8877-4eff-a992-1019b759e484 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 932.529457] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Created folder: Project (b3917aae625b4cfd9a0ab45ad226a4cb) in parent group-v119075. [ 932.529708] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Creating folder: Instances. Parent ref: group-v119125. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 932.529925] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b57323f4-3ab1-4185-ae09-e4356065ce51 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 932.540302] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Created folder: Instances in parent group-v119125. [ 932.540524] env[60400]: DEBUG oslo.service.loopingcall [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 932.540734] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 932.540888] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-431d1e73-168f-4f97-835a-dda5191859b8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 932.561876] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 932.561876] env[60400]: value = "task-449827" [ 932.561876] env[60400]: _type = "Task" [ 932.561876] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 932.569602] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449827, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 933.074463] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449827, 'name': CreateVM_Task, 'duration_secs': 0.34165} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 933.078279] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 933.078279] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 933.078279] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 933.078279] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 933.078279] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-179b5770-c5e8-48f5-997e-edd5512df9cc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.082571] env[60400]: DEBUG oslo_vmware.api [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Waiting for the task: (returnval){ [ 933.082571] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52079422-f49e-05a7-1a25-ae66631f333e" [ 933.082571] env[60400]: _type = "Task" [ 933.082571] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 933.092536] env[60400]: DEBUG oslo_vmware.api [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52079422-f49e-05a7-1a25-ae66631f333e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 933.595256] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 933.596143] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 933.596464] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 933.901469] env[60400]: DEBUG nova.compute.manager [req-19f9fef9-83c4-4478-a5d1-6b52feacbcf3 req-7db49f1a-31b2-44de-bbaa-0d1fda682d7a service nova] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Received event network-changed-8ffabf8c-a600-4aac-882d-f90ba4b11a79 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 933.901866] env[60400]: DEBUG nova.compute.manager [req-19f9fef9-83c4-4478-a5d1-6b52feacbcf3 req-7db49f1a-31b2-44de-bbaa-0d1fda682d7a service nova] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Refreshing instance network info cache due to event network-changed-8ffabf8c-a600-4aac-882d-f90ba4b11a79. 
{{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 933.902257] env[60400]: DEBUG oslo_concurrency.lockutils [req-19f9fef9-83c4-4478-a5d1-6b52feacbcf3 req-7db49f1a-31b2-44de-bbaa-0d1fda682d7a service nova] Acquiring lock "refresh_cache-b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 933.902465] env[60400]: DEBUG oslo_concurrency.lockutils [req-19f9fef9-83c4-4478-a5d1-6b52feacbcf3 req-7db49f1a-31b2-44de-bbaa-0d1fda682d7a service nova] Acquired lock "refresh_cache-b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 933.902685] env[60400]: DEBUG nova.network.neutron [req-19f9fef9-83c4-4478-a5d1-6b52feacbcf3 req-7db49f1a-31b2-44de-bbaa-0d1fda682d7a service nova] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Refreshing network info cache for port 8ffabf8c-a600-4aac-882d-f90ba4b11a79 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 933.934496] env[60400]: DEBUG nova.network.neutron [req-19f9fef9-83c4-4478-a5d1-6b52feacbcf3 req-7db49f1a-31b2-44de-bbaa-0d1fda682d7a service nova] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 934.521769] env[60400]: DEBUG nova.network.neutron [req-19f9fef9-83c4-4478-a5d1-6b52feacbcf3 req-7db49f1a-31b2-44de-bbaa-0d1fda682d7a service nova] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Instance is deleted, no further info cache update {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:106}} [ 934.521992] env[60400]: DEBUG oslo_concurrency.lockutils [req-19f9fef9-83c4-4478-a5d1-6b52feacbcf3 req-7db49f1a-31b2-44de-bbaa-0d1fda682d7a service nova] Releasing lock "refresh_cache-b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 934.522176] env[60400]: DEBUG nova.compute.manager [req-19f9fef9-83c4-4478-a5d1-6b52feacbcf3 req-7db49f1a-31b2-44de-bbaa-0d1fda682d7a service nova] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Received event network-vif-deleted-8ffabf8c-a600-4aac-882d-f90ba4b11a79 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 939.934148] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 941.933545] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 942.928562] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 942.948639] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 942.948856] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 942.950204] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 942.971201] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 942.971334] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 942.971374] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 942.972034] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 943.971215] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 944.933349] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 944.933578] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 944.933729] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 944.933871] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 945.933018] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 945.944587] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 945.945461] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 945.945461] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.945461] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 945.946782] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9ff1b36-fe18-411c-b4e9-df71661d0177 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.955921] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9a7359a-3637-4a4d-a4b6-1ff33d2d47da {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.971141] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61ae3fa8-7440-4ec8-a913-bf5abe6a3c77 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.981225] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27df1ef0-0e84-4a82-8c89-a147a3fc9cd4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.014824] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181791MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 946.014982] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.015230] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.059080] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 946.059235] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance e4f0342a-4169-40aa-b234-a2e2340d5b05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 946.070976] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 19881c50-a8ff-411f-b570-d4dc9ef3b0dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 946.082481] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 946.082680] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 946.082822] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 946.146567] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e995e97a-52ee-4a4f-b2c6-945422aea590 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.157080] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63c5ef5e-90ed-455c-80ec-6c81cc7544e6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.185754] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-007d1530-0f25-4df5-ad87-144cc3ef7d60 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.194018] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60f50ba0-971a-4afc-8ad5-411671827719 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.205810] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 946.216605] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 946.229490] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 946.229728] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.214s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.127432] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef 
tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "b5ad6145-8bf0-4aed-951b-eb11dd87ed7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.127779] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "b5ad6145-8bf0-4aed-951b-eb11dd87ed7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.638676] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "c6ee7d41-5522-4019-9da9-8503ec99e2b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.638914] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "c6ee7d41-5522-4019-9da9-8503ec99e2b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.732245] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d97a55c5-f248-482a-9986-212e84bdd0b0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.732450] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d97a55c5-f248-482a-9986-212e84bdd0b0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.543240] env[60400]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Acquiring lock "311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.543549] env[60400]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 
975.355686] env[60400]: WARNING oslo_vmware.rw_handles [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 975.355686] env[60400]: ERROR oslo_vmware.rw_handles [ 975.356452] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/7f1d2707-7738-4d3b-8958-63126ee0c241/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 975.357725] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 975.357977] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Copying Virtual Disk [datastore1] vmware_temp/7f1d2707-7738-4d3b-8958-63126ee0c241/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/7f1d2707-7738-4d3b-8958-63126ee0c241/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 975.358288] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cdab8022-920a-48be-ad7e-5571c91d7e2b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.367220] env[60400]: DEBUG oslo_vmware.api [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Waiting for the task: (returnval){ [ 975.367220] env[60400]: value = "task-449838" [ 975.367220] env[60400]: _type = "Task" [ 
975.367220] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 975.375901] env[60400]: DEBUG oslo_vmware.api [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Task: {'id': task-449838, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 975.878424] env[60400]: DEBUG oslo_vmware.exceptions [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 975.878673] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 975.879231] env[60400]: ERROR nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 975.879231] env[60400]: Faults: ['InvalidArgument'] [ 975.879231] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Traceback (most recent call last): [ 975.879231] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 975.879231] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] yield resources [ 975.879231] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 975.879231] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] self.driver.spawn(context, instance, image_meta, [ 975.879231] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 975.879231] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 975.879231] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 975.879231] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] self._fetch_image_if_missing(context, vi) [ 975.879231] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] image_cache(vi, tmp_image_ds_loc) [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 
65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] vm_util.copy_virtual_disk( [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] session._wait_for_task(vmdk_copy_task) [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] return self.wait_for_task(task_ref) [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] return evt.wait() [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] result = hub.switch() [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 975.879519] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] return self.greenlet.switch() [ 975.879898] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 975.879898] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] self.f(*self.args, **self.kw) [ 975.879898] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 975.879898] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] raise exceptions.translate_fault(task_info.error) [ 975.879898] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 975.879898] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Faults: ['InvalidArgument'] [ 975.879898] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] [ 975.879898] env[60400]: INFO nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Terminating instance [ 975.881442] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 975.881442] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 975.881576] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ae6c4bbb-4509-49b6-8a15-3deb917950f8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.883940] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 975.884147] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 975.884851] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff47e353-2ae6-4d4a-b8cf-e5f344fd1404 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.892147] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 975.892349] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-033d338f-cc3e-4126-a130-6347c46bdafb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.894722] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 975.894892] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 975.895823] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ebbde755-2b87-4d30-9652-58070c25ab62 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.901112] env[60400]: DEBUG oslo_vmware.api [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Waiting for the task: (returnval){ [ 975.901112] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52170f29-5f3c-ba6f-613c-374d910ba6c0" [ 975.901112] env[60400]: _type = "Task" [ 975.901112] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 975.908630] env[60400]: DEBUG oslo_vmware.api [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52170f29-5f3c-ba6f-613c-374d910ba6c0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 975.961072] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 975.961258] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 975.961394] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Deleting the datastore file [datastore1] 65bf8cf0-825c-42d8-bd78-62a6277d29d7 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 975.961645] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-140cfd9d-50d2-4a1b-a639-18344dd42464 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.969917] env[60400]: DEBUG oslo_vmware.api [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Waiting for the task: (returnval){ [ 975.969917] env[60400]: value = "task-449840" [ 975.969917] env[60400]: _type = "Task" [ 975.969917] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 975.977905] env[60400]: DEBUG oslo_vmware.api [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Task: {'id': task-449840, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 976.412113] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 976.412467] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Creating directory with path [datastore1] vmware_temp/98b05e84-31a8-4490-8187-4d6902b34205/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 976.412587] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1b17ec42-faa1-431c-a108-63654bfb9b0e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.424362] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Created directory with path [datastore1] vmware_temp/98b05e84-31a8-4490-8187-4d6902b34205/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 976.424588] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Fetch image to [datastore1] vmware_temp/98b05e84-31a8-4490-8187-4d6902b34205/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 976.424689] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/98b05e84-31a8-4490-8187-4d6902b34205/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 976.425402] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5510ba76-08c9-42d6-ba7c-d9a4502e6275 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.432170] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e046d91-4e3d-4d29-b936-3ce752fe5ae2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.441341] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00bd9ff0-3475-43df-a181-7bb55795b7ad {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.475172] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ca053cc-1ec0-4fa1-b9c4-76b9141b1f46 {{(pid=60400) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.483553] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0b2e363b-2d6c-44a3-833d-fe97ba0a7a78 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.485195] env[60400]: DEBUG oslo_vmware.api [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Task: {'id': task-449840, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070013} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 976.485413] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 976.485582] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 976.485742] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 976.485908] env[60400]: INFO nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Took 0.60 seconds to destroy the instance on the hypervisor. 
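The CreateVM_Task, CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same oslo.vmware pattern: invoke_api() starts a vCenter task and returns its reference, and wait_for_task() drives the "progress is 0%." / "completed successfully" polling loop that _poll_task logs. A minimal sketch of that pattern, assuming placeholder host/credentials and reusing the datastore path from the destroy sequence purely for illustration (none of these values come from this run):

    # Sketch of the invoke-then-poll pattern behind task-449827/task-449840.
    # Host, credentials and retry settings are illustrative placeholders.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vcenter.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # FileManager.DeleteDatastoreFile_Task, as in the destroy sequence above;
    # a real caller passes the datacenter managed-object reference instead
    # of None.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] 65bf8cf0-825c-42d8-bd78-62a6277d29d7',
        datacenter=None)

    # wait_for_task() polls the task object (the "progress is 0%." lines)
    # and returns the task info on success or raises on a fault.
    session.wait_for_task(task)
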
[ 976.487972] env[60400]: DEBUG nova.compute.claims [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 976.488149] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 976.488350] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 976.511722] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 976.558602] env[60400]: DEBUG oslo_vmware.rw_handles [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/98b05e84-31a8-4490-8187-4d6902b34205/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 976.615145] env[60400]: DEBUG oslo_vmware.rw_handles [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 976.615329] env[60400]: DEBUG oslo_vmware.rw_handles [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/98b05e84-31a8-4490-8187-4d6902b34205/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 976.662474] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aa650f6-53f2-49b2-a880-7e1dfa2ac2e7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.670129] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0fc9ddf-9a67-4fc2-b247-0d6433a9b607 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.700813] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a684729-5d7a-4823-b835-77f881e779b0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.708057] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da6b5326-c10d-425f-8c86-71954e8cd0a7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.720981] env[60400]: DEBUG nova.compute.provider_tree [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 976.730739] env[60400]: DEBUG nova.scheduler.client.report [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 976.743392] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.255s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 976.743895] env[60400]: ERROR nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 976.743895] env[60400]: Faults: ['InvalidArgument'] [ 976.743895] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Traceback (most recent call last): [ 976.743895] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 976.743895] env[60400]: ERROR nova.compute.manager [instance: 
65bf8cf0-825c-42d8-bd78-62a6277d29d7] self.driver.spawn(context, instance, image_meta,
[ 976.743895] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 976.743895] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 976.743895] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 976.743895] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] self._fetch_image_if_missing(context, vi)
[ 976.743895] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 976.743895] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] image_cache(vi, tmp_image_ds_loc)
[ 976.743895] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] vm_util.copy_virtual_disk(
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] session._wait_for_task(vmdk_copy_task)
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] return self.wait_for_task(task_ref)
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] return evt.wait()
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] result = hub.switch()
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] return self.greenlet.switch()
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 976.744247] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] self.f(*self.args, **self.kw)
[ 976.744694] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 976.744694] env[60400]: ERROR 
nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] raise exceptions.translate_fault(task_info.error) [ 976.744694] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 976.744694] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Faults: ['InvalidArgument'] [ 976.744694] env[60400]: ERROR nova.compute.manager [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] [ 976.744694] env[60400]: DEBUG nova.compute.utils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] VimFaultException {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 976.745924] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Build of instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7 was re-scheduled: A specified parameter was not correct: fileType [ 976.745924] env[60400]: Faults: ['InvalidArgument'] {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 976.746288] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 976.746453] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 976.746599] env[60400]: DEBUG nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 976.746752] env[60400]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 977.214536] env[60400]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 977.224557] env[60400]: INFO nova.compute.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Took 0.48 seconds to deallocate network for instance. [ 977.317616] env[60400]: INFO nova.scheduler.client.report [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Deleted allocations for instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7 [ 977.336031] env[60400]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 343.578s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.337110] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 144.468s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.337333] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 977.337573] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.337760] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.340449] env[60400]: INFO nova.compute.manager [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Terminating instance [ 977.342404] env[60400]: DEBUG nova.compute.manager [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 977.342537] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 977.342963] env[60400]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2d8dc553-7675-4089-a81a-1be9dbec03e6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.353708] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e0589d8-e987-4efa-a2ac-b2004cc05796 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.364990] env[60400]: DEBUG nova.compute.manager [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] [instance: 49aaf98b-945e-4c5d-8158-641b8650a8a7] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 977.386273] env[60400]: WARNING nova.virt.vmwareapi.vmops [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7 could not be found. [ 977.386502] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 977.386678] env[60400]: INFO nova.compute.manager [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Took 0.04 seconds to destroy the instance on the hypervisor. 
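The traceback above ends in oslo_vmware.api._poll_task: the vmdk_copy_task started by vm_util.copy_virtual_disk is polled until it reaches an error state, and the vCenter fault ("A specified parameter was not correct: fileType", faults ['InvalidArgument']) is translated into the VimFaultException that aborts the claim and re-schedules the instance. The same polling machinery succeeds further down for task-449843 (CreateVM_Task, "progress is 0%." followed by "completed successfully" with duration_secs 0.292299). A minimal sketch of that pattern, assuming a hypothetical session.get_task_info() helper in place of the real PropertyCollector round-trips, and a local stand-in for oslo_vmware.exceptions.VimFaultException:

import time

class VimFaultException(Exception):
    # Local stand-in for oslo_vmware.exceptions.VimFaultException;
    # carries the vCenter fault names alongside the message.
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(session, task_ref, interval=0.5):
    # Poll a vCenter task reference until it reaches a terminal state.
    # session.get_task_info() is a hypothetical helper standing in for
    # the PropertyCollector.RetrievePropertiesEx calls seen throughout
    # this log; the real oslo.vmware loop runs inside an eventlet
    # loopingcall rather than sleeping inline.
    while True:
        task_info = session.get_task_info(task_ref)
        if task_info.state == 'success':
            # e.g. task-449843 (CreateVM_Task) completing in ~0.29s
            return task_info
        if task_info.state == 'error':
            # A fault such as "A specified parameter was not correct:
            # fileType" / ['InvalidArgument'] surfaces here and is
            # raised back through session._wait_for_task into spawn.
            raise VimFaultException(
                getattr(task_info.error, 'faults', []),
                getattr(task_info.error, 'localizedMessage', 'task failed'))
        # Between polls the task reports partial progress, which is
        # what the "progress is 0%." DEBUG lines record.
        time.sleep(interval)

On the error branch the build unwinds exactly as logged: the compute_resources claim is aborted and released, the placement allocations are deleted, and _do_build_and_run_instance re-schedules the instance before terminate_instance cleans it up.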
[ 977.386910] env[60400]: DEBUG oslo.service.loopingcall [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 977.387243] env[60400]: DEBUG nova.compute.manager [-] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 977.387343] env[60400]: DEBUG nova.network.neutron [-] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 977.411894] env[60400]: DEBUG nova.network.neutron [-] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 977.413456] env[60400]: DEBUG nova.compute.manager [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] [instance: 49aaf98b-945e-4c5d-8158-641b8650a8a7] Instance disappeared before build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 977.419238] env[60400]: INFO nova.compute.manager [-] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Took 0.03 seconds to deallocate network for instance. [ 977.436210] env[60400]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "49aaf98b-945e-4c5d-8158-641b8650a8a7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 241.532s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.450210] env[60400]: DEBUG nova.compute.manager [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] [instance: cb7a8413-4414-4de6-8d4f-9ac4f1784f35] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 977.476688] env[60400]: DEBUG nova.compute.manager [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] [instance: cb7a8413-4414-4de6-8d4f-9ac4f1784f35] Instance disappeared before build. 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 977.497462] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "cb7a8413-4414-4de6-8d4f-9ac4f1784f35" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.103s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.504116] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f4f513fe-b892-48b1-8990-4093ad79658d tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "65bf8cf0-825c-42d8-bd78-62a6277d29d7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.167s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.509453] env[60400]: DEBUG nova.compute.manager [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: 01b62d6f-6718-45b4-8f67-cdb77c5f4bd0] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 977.531631] env[60400]: DEBUG nova.compute.manager [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: 01b62d6f-6718-45b4-8f67-cdb77c5f4bd0] Instance disappeared before build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 977.551260] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "01b62d6f-6718-45b4-8f67-cdb77c5f4bd0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.477s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.559231] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Starting instance... 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 977.605826] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 977.606078] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.607630] env[60400]: INFO nova.compute.claims [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 977.728971] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a5d45ef-a68f-4447-863c-230fd729fe12 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.737094] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f05430e1-4822-4628-8da8-e1c71f30e22d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.766114] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e96aa611-8478-48da-b9f4-455bb3dba0dc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.773211] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abe8c6bd-ff44-4e84-94bf-9e78a5859efc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.787071] env[60400]: DEBUG nova.compute.provider_tree [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 977.794897] env[60400]: DEBUG nova.scheduler.client.report [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 977.808887] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 
tempest-ServerActionsTestJSON-973878157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.809331] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 977.839127] env[60400]: DEBUG nova.compute.utils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 977.840430] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 977.840597] env[60400]: DEBUG nova.network.neutron [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 977.851511] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 977.905224] env[60400]: DEBUG nova.policy [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3a776640ecd74bf6b1f54f2a84c1f44b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bf64f2b352e24fe39bc883bbca0e091e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 977.913209] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Start spawning the instance on the hypervisor. 
{{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 977.933728] env[60400]: DEBUG nova.virt.hardware [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 977.933964] env[60400]: DEBUG nova.virt.hardware [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 977.934122] env[60400]: DEBUG nova.virt.hardware [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 977.934296] env[60400]: DEBUG nova.virt.hardware [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 977.934432] env[60400]: DEBUG nova.virt.hardware [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 977.934578] env[60400]: DEBUG nova.virt.hardware [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 977.934775] env[60400]: DEBUG nova.virt.hardware [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 977.934927] env[60400]: DEBUG nova.virt.hardware [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 977.935096] env[60400]: DEBUG nova.virt.hardware [None 
req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 977.935254] env[60400]: DEBUG nova.virt.hardware [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 977.935417] env[60400]: DEBUG nova.virt.hardware [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 977.936333] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6885c073-7b3b-4d51-8842-032dd870666e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.946327] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e95ad3e-0296-4e38-83d7-0e07cf4a7227 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 978.228234] env[60400]: DEBUG nova.network.neutron [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Successfully created port: 21377081-ea82-47d1-a066-de059fb50c29 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 978.848901] env[60400]: DEBUG nova.network.neutron [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Successfully updated port: 21377081-ea82-47d1-a066-de059fb50c29 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 978.866499] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "refresh_cache-19881c50-a8ff-411f-b570-d4dc9ef3b0dc" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 978.866953] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquired lock "refresh_cache-19881c50-a8ff-411f-b570-d4dc9ef3b0dc" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 978.867161] env[60400]: DEBUG nova.network.neutron [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 978.909830] env[60400]: DEBUG nova.network.neutron [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 
19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 979.132411] env[60400]: DEBUG nova.network.neutron [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Updating instance_info_cache with network_info: [{"id": "21377081-ea82-47d1-a066-de059fb50c29", "address": "fa:16:3e:d8:78:6e", "network": {"id": "7a973274-0c21-4459-87ec-3ccc09ba43e4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-537014608-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf64f2b352e24fe39bc883bbca0e091e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cbc9b8f-ce19-4262-bf4d-88cd4f259a1c", "external-id": "nsx-vlan-transportzone-630", "segmentation_id": 630, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap21377081-ea", "ovs_interfaceid": "21377081-ea82-47d1-a066-de059fb50c29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 979.145572] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Releasing lock "refresh_cache-19881c50-a8ff-411f-b570-d4dc9ef3b0dc" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 979.145572] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Instance network_info: |[{"id": "21377081-ea82-47d1-a066-de059fb50c29", "address": "fa:16:3e:d8:78:6e", "network": {"id": "7a973274-0c21-4459-87ec-3ccc09ba43e4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-537014608-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf64f2b352e24fe39bc883bbca0e091e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cbc9b8f-ce19-4262-bf4d-88cd4f259a1c", "external-id": "nsx-vlan-transportzone-630", "segmentation_id": 630, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap21377081-ea", "ovs_interfaceid": "21377081-ea82-47d1-a066-de059fb50c29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 979.145822] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d8:78:6e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8cbc9b8f-ce19-4262-bf4d-88cd4f259a1c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '21377081-ea82-47d1-a066-de059fb50c29', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 979.153486] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Creating folder: Project (bf64f2b352e24fe39bc883bbca0e091e). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 979.154020] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3f091bfe-4226-44f6-a1b8-f27903f7dca6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 979.165783] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Created folder: Project (bf64f2b352e24fe39bc883bbca0e091e) in parent group-v119075. [ 979.165974] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Creating folder: Instances. Parent ref: group-v119132. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 979.166226] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0bdb18dc-555b-4fc8-9ab8-607afeeef23b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 979.176710] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Created folder: Instances in parent group-v119132. [ 979.176945] env[60400]: DEBUG oslo.service.loopingcall [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 979.177142] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 979.177339] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8bf3935d-7b76-43f8-9041-39fa0cff52e6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 979.197177] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 979.197177] env[60400]: value = "task-449843" [ 979.197177] env[60400]: _type = "Task" [ 979.197177] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 979.205570] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449843, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 979.289508] env[60400]: DEBUG nova.compute.manager [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Received event network-vif-plugged-21377081-ea82-47d1-a066-de059fb50c29 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 979.289508] env[60400]: DEBUG oslo_concurrency.lockutils [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] Acquiring lock "19881c50-a8ff-411f-b570-d4dc9ef3b0dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 979.289769] env[60400]: DEBUG oslo_concurrency.lockutils [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] Lock "19881c50-a8ff-411f-b570-d4dc9ef3b0dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 979.289937] env[60400]: DEBUG oslo_concurrency.lockutils [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] Lock "19881c50-a8ff-411f-b570-d4dc9ef3b0dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 979.290106] env[60400]: DEBUG nova.compute.manager [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] No waiting events found dispatching network-vif-plugged-21377081-ea82-47d1-a066-de059fb50c29 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 979.290265] env[60400]: WARNING nova.compute.manager [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Received unexpected event network-vif-plugged-21377081-ea82-47d1-a066-de059fb50c29 for instance with vm_state building and task_state spawning. [ 979.290415] env[60400]: DEBUG nova.compute.manager [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Received event network-changed-21377081-ea82-47d1-a066-de059fb50c29 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 979.290560] env[60400]: DEBUG nova.compute.manager [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Refreshing instance network info cache due to event network-changed-21377081-ea82-47d1-a066-de059fb50c29. 
{{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 979.290733] env[60400]: DEBUG oslo_concurrency.lockutils [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] Acquiring lock "refresh_cache-19881c50-a8ff-411f-b570-d4dc9ef3b0dc" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 979.290899] env[60400]: DEBUG oslo_concurrency.lockutils [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] Acquired lock "refresh_cache-19881c50-a8ff-411f-b570-d4dc9ef3b0dc" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 979.291100] env[60400]: DEBUG nova.network.neutron [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Refreshing network info cache for port 21377081-ea82-47d1-a066-de059fb50c29 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 979.583200] env[60400]: DEBUG nova.network.neutron [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Updated VIF entry in instance network info cache for port 21377081-ea82-47d1-a066-de059fb50c29. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 979.583550] env[60400]: DEBUG nova.network.neutron [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Updating instance_info_cache with network_info: [{"id": "21377081-ea82-47d1-a066-de059fb50c29", "address": "fa:16:3e:d8:78:6e", "network": {"id": "7a973274-0c21-4459-87ec-3ccc09ba43e4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-537014608-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf64f2b352e24fe39bc883bbca0e091e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cbc9b8f-ce19-4262-bf4d-88cd4f259a1c", "external-id": "nsx-vlan-transportzone-630", "segmentation_id": 630, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap21377081-ea", "ovs_interfaceid": "21377081-ea82-47d1-a066-de059fb50c29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 979.593093] env[60400]: DEBUG oslo_concurrency.lockutils [req-eaac3b0f-7a03-4ee0-95ee-fbf55e9ecb8b req-6b469fe1-2bc9-4266-af92-02dc9bacda4a service nova] Releasing lock "refresh_cache-19881c50-a8ff-411f-b570-d4dc9ef3b0dc" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 979.709609] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449843, 'name': CreateVM_Task, 'duration_secs': 0.292299} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 979.709812] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 979.710442] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 979.710598] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 979.710910] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 979.711181] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1dbd0f40-dcf4-487e-bc6f-ad51d566f2e5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 979.715687] env[60400]: DEBUG oslo_vmware.api [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Waiting for the task: (returnval){ [ 979.715687] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52e7851d-a8ca-318d-1e18-671fccf7842b" [ 979.715687] env[60400]: _type = "Task" [ 979.715687] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 979.723383] env[60400]: DEBUG oslo_vmware.api [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52e7851d-a8ca-318d-1e18-671fccf7842b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 980.232948] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 980.233284] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 980.233507] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 983.001550] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "e924a9ab-71c1-4efe-a217-b036ec785dc8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 983.001942] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "e924a9ab-71c1-4efe-a217-b036ec785dc8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1002.230507] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1002.933113] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1002.933288] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1002.933405] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1002.944648] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Skipping network cache update for instance because it is Building. 
{{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1002.944810] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1002.944920] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1003.933510] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1003.933852] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1003.933883] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1005.933134] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1005.933390] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1006.933878] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1006.934184] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1006.934285] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1006.943780] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1006.943991] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1006.944188] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1006.944339] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1006.945416] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f4ee0e-edd9-40ad-9c4e-0040bef3d149 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1006.954334] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fd0edbe-4546-48d0-bb9a-943a68319e45 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1006.967578] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e56856ad-80e7-44a4-807f-83b3038ba3bf {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1006.973487] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d9512ed-70d3-489f-accd-7ce0ffc7b72c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.003047] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181764MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1007.003192] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1007.003369] env[60400]: DEBUG oslo_concurrency.lockutils [None 
req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1007.042371] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance e4f0342a-4169-40aa-b234-a2e2340d5b05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1007.042517] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 19881c50-a8ff-411f-b570-d4dc9ef3b0dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1007.051794] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 1007.061343] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance b5ad6145-8bf0-4aed-951b-eb11dd87ed7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 1007.070686] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance c6ee7d41-5522-4019-9da9-8503ec99e2b5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 1007.080664] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance d97a55c5-f248-482a-9986-212e84bdd0b0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 1007.088969] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
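[editor's note] The allocation payloads logged here use the placement format, a mapping of resource class to integer amount, and they reconcile with the usage totals reported just below. A hedged illustration using the two "actively managed" instances from this log:

    # Placement-style allocations copied from the two "actively managed" lines above.
    allocations = {
        'e4f0342a-4169-40aa-b234-a2e2340d5b05': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}},
        '19881c50-a8ff-411f-b570-d4dc9ef3b0dc': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}},
    }

    used_vcpus = sum(a['resources'].get('VCPU', 0) for a in allocations.values())
    used_ram = sum(a['resources'].get('MEMORY_MB', 0) for a in allocations.values())
    # -> 2 and 256: matches "total allocated vcpus: 2"; the final view's
    # used_ram=768MB is this 256MB plus the 512MB MEMORY_MB reservation.
    print(used_vcpus, used_ram)
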
{{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 1007.098109] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance e924a9ab-71c1-4efe-a217-b036ec785dc8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} [ 1007.098311] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1007.098455] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1007.195530] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c0e6e68-7a50-42e4-a4eb-edbb32c2ff3b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.203320] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3318b62e-85f4-4d1b-a01f-41171473de1c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.232486] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6f95671-dbdc-4ece-b0cc-a6f14f661415 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.239605] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeb22c36-6d99-4785-8528-7be29d8a4211 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.253324] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1007.261441] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1007.273934] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1007.274259] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.271s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1026.453051] env[60400]: WARNING oslo_vmware.rw_handles [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1026.453051] env[60400]: ERROR oslo_vmware.rw_handles [ 1026.453051] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/98b05e84-31a8-4490-8187-4d6902b34205/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1026.454955] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1026.455229] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Copying Virtual Disk [datastore1] vmware_temp/98b05e84-31a8-4490-8187-4d6902b34205/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/98b05e84-31a8-4490-8187-4d6902b34205/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1026.455538] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d894c3be-834d-46c2-bb15-ef8faa8246d3 {{(pid=60400) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.463166] env[60400]: DEBUG oslo_vmware.api [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Waiting for the task: (returnval){ [ 1026.463166] env[60400]: value = "task-449844" [ 1026.463166] env[60400]: _type = "Task" [ 1026.463166] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.471181] env[60400]: DEBUG oslo_vmware.api [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Task: {'id': task-449844, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1026.974287] env[60400]: DEBUG oslo_vmware.exceptions [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1026.974541] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1026.975122] env[60400]: ERROR nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1026.975122] env[60400]: Faults: ['InvalidArgument'] [ 1026.975122] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Traceback (most recent call last): [ 1026.975122] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1026.975122] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] yield resources [ 1026.975122] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1026.975122] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] self.driver.spawn(context, instance, image_meta, [ 1026.975122] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1026.975122] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1026.975122] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1026.975122] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] self._fetch_image_if_missing(context, vi) [ 
1026.975122] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] image_cache(vi, tmp_image_ds_loc) [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] vm_util.copy_virtual_disk( [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] session._wait_for_task(vmdk_copy_task) [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] return self.wait_for_task(task_ref) [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] return evt.wait() [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] result = hub.switch() [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1026.975471] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] return self.greenlet.switch() [ 1026.975865] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1026.975865] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] self.f(*self.args, **self.kw) [ 1026.975865] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1026.975865] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] raise exceptions.translate_fault(task_info.error) [ 1026.975865] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1026.975865] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Faults: ['InvalidArgument'] [ 1026.975865] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] [ 1026.975865] env[60400]: INFO nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 
tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Terminating instance [ 1026.976987] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1026.977197] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1026.977419] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aaa35a57-afa4-4a91-8b95-c28e6adbc0c6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.979535] env[60400]: DEBUG nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1026.979756] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1026.980468] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a14f65b-9701-4ffc-afee-c083fa49ebee {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.987134] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1026.987348] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b6e46016-4cd2-4d77-9ea7-bdb155cea553 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.989495] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1026.989673] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1026.990624] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2959b39b-8293-4357-a6cc-5ab0ea2e0158 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.995472] env[60400]: DEBUG oslo_vmware.api [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Waiting for the task: (returnval){ [ 1026.995472] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52920349-66d1-c356-0f92-1b3329f4432c" [ 1026.995472] env[60400]: _type = "Task" [ 1026.995472] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1027.002542] env[60400]: DEBUG oslo_vmware.api [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52920349-66d1-c356-0f92-1b3329f4432c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.464066] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1027.464336] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1027.464480] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Deleting the datastore file [datastore1] e4f0342a-4169-40aa-b234-a2e2340d5b05 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1027.464739] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9f600843-b20c-46e1-8699-eeace6142ce2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.470858] env[60400]: DEBUG oslo_vmware.api [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Waiting for the task: (returnval){ [ 1027.470858] env[60400]: value = "task-449846" [ 1027.470858] env[60400]: _type = "Task" [ 1027.470858] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1027.478155] env[60400]: DEBUG oslo_vmware.api [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Task: {'id': task-449846, 'name': DeleteDatastoreFile_Task} progress is 0%. 
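[editor's note] The "Waiting for the task: (returnval){...} to complete" blocks followed by "progress is 0%" lines are oslo.vmware's task poll loop. A simplified sketch of that loop, where get_task_info() is a hypothetical stand-in for the real property-collector read:

    import time

    def wait_for_task(get_task_info, interval=0.5):
        # Poll TaskInfo until a terminal state, as oslo_vmware.api._poll_task does;
        # on error the real code raises exceptions.translate_fault(task_info.error),
        # which is where the VimFaultException in the traceback above comes from.
        while True:
            info = get_task_info()  # e.g. {'state': 'running', 'progress': 0, 'error': None}
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                raise RuntimeError(info['error'])
            time.sleep(interval)  # the real loop runs as a green-thread looping call
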
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.504594] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1027.504851] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Creating directory with path [datastore1] vmware_temp/5daad7c5-8048-4bd0-be78-55ebb0fd8a23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1027.505090] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-44dd44e8-4980-46d6-a192-5196624072d6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.523312] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Created directory with path [datastore1] vmware_temp/5daad7c5-8048-4bd0-be78-55ebb0fd8a23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1027.523489] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Fetch image to [datastore1] vmware_temp/5daad7c5-8048-4bd0-be78-55ebb0fd8a23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1027.523653] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/5daad7c5-8048-4bd0-be78-55ebb0fd8a23/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1027.524339] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45c87b1f-f544-425d-8988-ec8798d81347 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.530349] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-090e2f84-6af4-40da-926c-7095382202be {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.538937] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c934182-07f3-45a2-8ffc-1f223e698488 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.569721] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3238a64c-439f-4c8b-80fb-afc7934fcaa0 {{(pid=60400) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.574935] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a3008558-8b83-4a5c-9333-2bf626692fb4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.593618] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1027.727481] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1027.729196] env[60400]: ERROR nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. [ 1027.729196] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Traceback (most recent call last): [ 1027.729196] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1027.729196] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1027.729196] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1027.729196] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] result = getattr(controller, method)(*args, **kwargs) [ 1027.729196] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1027.729196] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self._get(image_id) [ 1027.729196] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1027.729196] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1027.729196] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] resp, body = self.http_client.get(url, headers=header) [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self.request(url, 'GET', **kwargs) [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self._handle_response(resp) [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise exc.from_response(resp, resp.content) [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] During handling of the above exception, another exception occurred: [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1027.729539] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Traceback (most recent call last): [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] yield resources [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self.driver.spawn(context, instance, image_meta, [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self._fetch_image_if_missing(context, vi) [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] image_fetch(context, vi, tmp_image_ds_loc) [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 
63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] images.fetch_image( [ 1027.729888] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] metadata = IMAGE_API.get(context, image_ref) [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return session.show(context, image_id, [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] _reraise_translated_image_exception(image_id) [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise new_exc.with_traceback(exc_trace) [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] result = getattr(controller, method)(*args, **kwargs) [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1027.730250] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self._get(image_id) [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] resp, body = self.http_client.get(url, headers=header) [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self.request(url, 'GET', 
**kwargs) [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self._handle_response(resp) [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise exc.from_response(resp, resp.content) [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. [ 1027.730585] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1027.730919] env[60400]: INFO nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Terminating instance [ 1027.731073] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1027.731282] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1027.731882] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Start destroying the instance on the hypervisor. 
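[editor's note] The second traceback shows the translation step explicitly: glanceclient's HTTP 401 is re-raised as nova.exception.ImageNotAuthorized via _reraise_translated_image_exception, preserving the original traceback. The pattern in miniature, with simplified stand-in exception classes:

    import sys

    class HTTPUnauthorized(Exception):
        """Stand-in for glanceclient.exc.HTTPUnauthorized."""

    class ImageNotAuthorized(Exception):
        """Stand-in for nova.exception.ImageNotAuthorized."""

    def show_image(image_id):
        try:
            raise HTTPUnauthorized('401 from the image service')
        except HTTPUnauthorized:
            # Translate the client error into the domain error but keep the
            # original traceback, mirroring "raise new_exc.with_traceback(exc_trace)".
            _, _, exc_trace = sys.exc_info()
            raise ImageNotAuthorized(
                f'Not authorized for image {image_id}.').with_traceback(exc_trace)
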
{{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1027.732074] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1027.732291] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6626c2c6-47dc-4f4d-8fa4-b5179a3febfb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.735073] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dfe43a8-96a2-4a86-84aa-51ceb09a04a3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.742336] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1027.742552] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-84df61ce-c2a4-4774-bbfb-e48b97c9fa29 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.744732] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1027.744898] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1027.745831] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-be9775a0-f2c7-43b7-ac34-ae72954e0e9b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.750497] env[60400]: DEBUG oslo_vmware.api [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Waiting for the task: (returnval){ [ 1027.750497] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5279530d-ddb0-d750-7755-310ad9840370" [ 1027.750497] env[60400]: _type = "Task" [ 1027.750497] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1027.757380] env[60400]: DEBUG oslo_vmware.api [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5279530d-ddb0-d750-7755-310ad9840370, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.808718] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1027.808938] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1027.809122] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Deleting the datastore file [datastore1] 63151ec9-f383-46cc-ac57-c3f7f1569410 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1027.809363] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-57d2dabf-89bc-4bcc-8ccf-d34f1ea8d902 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.816362] env[60400]: DEBUG oslo_vmware.api [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Waiting for the task: (returnval){ [ 1027.816362] env[60400]: value = "task-449848" [ 1027.816362] env[60400]: _type = "Task" [ 1027.816362] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1027.823699] env[60400]: DEBUG oslo_vmware.api [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Task: {'id': task-449848, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.980895] env[60400]: DEBUG oslo_vmware.api [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Task: {'id': task-449846, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073964} completed successfully. 
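[editor's note] The claim-abort path that follows serializes on the "compute_resources" lock, producing the Acquiring/acquired/released triplets that recur throughout this log. A minimal sketch of that idiom with oslo.concurrency (the function body is a placeholder):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(instance):
        # Everything in here runs under the "compute_resources" lock, which is
        # why the log prints "acquired ... waited N s" then "released ... held N s".
        pass
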
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1027.981081] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1027.981261] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1027.981431] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1027.981600] env[60400]: INFO nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Took 1.00 seconds to destroy the instance on the hypervisor. [ 1027.983678] env[60400]: DEBUG nova.compute.claims [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1027.983842] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1027.984064] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1028.112587] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2f7caa6-b4ae-40ae-97d8-cfcd9d2e681e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.119741] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00ae8bef-1b02-4deb-9e5c-81d400efe716 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.149806] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-effb421c-c60f-4a84-a10b-3f2f21c3b1da {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.156414] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-821f55aa-18dd-488e-b191-79a1f3625a1b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.168949] env[60400]: DEBUG nova.compute.provider_tree [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1028.177519] env[60400]: DEBUG nova.scheduler.client.report [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1028.190730] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.207s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.191250] env[60400]: ERROR nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1028.191250] env[60400]: Faults: ['InvalidArgument'] [ 1028.191250] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Traceback (most recent call last): [ 1028.191250] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1028.191250] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] self.driver.spawn(context, instance, image_meta, [ 1028.191250] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1028.191250] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1028.191250] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1028.191250] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] self._fetch_image_if_missing(context, vi) [ 1028.191250] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1028.191250] env[60400]: ERROR nova.compute.manager [instance: 
e4f0342a-4169-40aa-b234-a2e2340d5b05] image_cache(vi, tmp_image_ds_loc) [ 1028.191250] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] vm_util.copy_virtual_disk( [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] session._wait_for_task(vmdk_copy_task) [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] return self.wait_for_task(task_ref) [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] return evt.wait() [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] result = hub.switch() [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] return self.greenlet.switch() [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1028.191576] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] self.f(*self.args, **self.kw) [ 1028.191905] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1028.191905] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] raise exceptions.translate_fault(task_info.error) [ 1028.191905] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1028.191905] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Faults: ['InvalidArgument'] [ 1028.191905] env[60400]: ERROR nova.compute.manager [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] [ 1028.192061] env[60400]: DEBUG nova.compute.utils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] VimFaultException {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1028.193242] env[60400]: DEBUG nova.compute.manager [None 
req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Build of instance e4f0342a-4169-40aa-b234-a2e2340d5b05 was re-scheduled: A specified parameter was not correct: fileType [ 1028.193242] env[60400]: Faults: ['InvalidArgument'] {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1028.193603] env[60400]: DEBUG nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1028.193767] env[60400]: DEBUG nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1028.193930] env[60400]: DEBUG nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1028.194099] env[60400]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1028.262316] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1028.262573] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Creating directory with path [datastore1] vmware_temp/632aeba1-8c03-46cf-8558-403fb8c7329f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1028.262800] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6999851e-e1ad-498b-82de-056a9625dd0e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.274056] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Created directory with path [datastore1] vmware_temp/632aeba1-8c03-46cf-8558-403fb8c7329f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1028.274197] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 
tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Fetch image to [datastore1] vmware_temp/632aeba1-8c03-46cf-8558-403fb8c7329f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1028.274360] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/632aeba1-8c03-46cf-8558-403fb8c7329f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1028.275071] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d182c16-9bed-495c-86b2-cbafe6c8aeab {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.281273] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f99e097-b302-45cd-ac69-743f6174781b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.289952] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f422fe3-d574-4981-9213-e8e5f52ca5fd {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.322631] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d5becbc-9dde-4031-ac7a-a0d3b45b7ee4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.330855] env[60400]: DEBUG oslo_vmware.api [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Task: {'id': task-449848, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069452} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1028.332369] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1028.332553] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1028.332718] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1028.332884] env[60400]: INFO nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1028.334641] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-44774559-100c-4b49-ad94-67d806692869 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.336580] env[60400]: DEBUG nova.compute.claims [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1028.336744] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1028.336945] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1028.358918] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1028.362298] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.362917] env[60400]: DEBUG nova.compute.utils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Instance 63151ec9-f383-46cc-ac57-c3f7f1569410 could not be found. {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1028.364281] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Instance disappeared during build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1028.364438] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1028.364591] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1028.364747] env[60400]: DEBUG nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1028.364898] env[60400]: DEBUG nova.network.neutron [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1028.400308] env[60400]: DEBUG neutronclient.v2_0.client [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60400) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1028.404453] env[60400]: ERROR nova.compute.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Traceback (most recent call last): [ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] result = getattr(controller, method)(*args, **kwargs) [ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self._get(image_id) [ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1028.404453] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] resp, body = self.http_client.get(url, headers=header) [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self.request(url, 'GET', **kwargs) [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self._handle_response(resp) [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise exc.from_response(resp, resp.content) [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] During handling of the above exception, another exception occurred: [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Traceback (most recent call last): [ 1028.404899] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self.driver.spawn(context, instance, image_meta, [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self._fetch_image_if_missing(context, vi) [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] image_fetch(context, vi, tmp_image_ds_loc) [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] images.fetch_image( [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] metadata = IMAGE_API.get(context, image_ref) [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1028.405205] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return session.show(context, image_id, [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] _reraise_translated_image_exception(image_id) [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise new_exc.with_traceback(exc_trace) [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 
63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] result = getattr(controller, method)(*args, **kwargs) [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self._get(image_id) [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1028.405505] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] resp, body = self.http_client.get(url, headers=header) [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self.request(url, 'GET', **kwargs) [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self._handle_response(resp) [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise exc.from_response(resp, resp.content) [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] During handling of the above exception, another exception occurred: [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Traceback (most recent call last): [ 1028.405790] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self._build_and_run_instance(context, instance, image, [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] with excutils.save_and_reraise_exception(): [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self.force_reraise() [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise self.value [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] with self.rt.instance_claim(context, instance, node, allocs, [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self.abort() [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1028.406147] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self.tracker.abort_instance_claim(self.context, self.instance, [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return f(*args, **kwargs) [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self._unset_instance_host_and_node(instance) [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 
63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] instance.save() [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] updates, result = self.indirection_api.object_action( [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return cctxt.call(context, 'object_action', objinst=objinst, [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1028.406465] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] result = self.transport._send( [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self._driver.send(target, ctxt, message, [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise result [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] nova.exception_Remote.InstanceNotFound_Remote: Instance 63151ec9-f383-46cc-ac57-c3f7f1569410 could not be found. 
[ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Traceback (most recent call last): [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return getattr(target, method)(*args, **kwargs) [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.406763] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return fn(self, *args, **kwargs) [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] old_ref, inst_ref = db.instance_update_and_get_original( [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return f(*args, **kwargs) [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] with excutils.save_and_reraise_exception() as ectxt: [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self.force_reraise() [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.407073] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise self.value [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return f(*args, 
**kwargs) [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return f(context, *args, **kwargs) [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise exception.InstanceNotFound(instance_id=uuid) [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.407472] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] nova.exception.InstanceNotFound: Instance 63151ec9-f383-46cc-ac57-c3f7f1569410 could not be found. [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] During handling of the above exception, another exception occurred: [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Traceback (most recent call last): [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] ret = obj(*args, **kwargs) [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] exception_handler_v20(status_code, error_body) [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise client_exc(message=error_message, [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1028.408392] 
env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Neutron server returns request_ids: ['req-1164b168-2d1e-4595-b06d-b6f9ff354bcf'] [ 1028.408392] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] During handling of the above exception, another exception occurred: [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Traceback (most recent call last): [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self._deallocate_network(context, instance, requested_networks) [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self.network_api.deallocate_for_instance( [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/network/neutron.py", line 1798, in deallocate_for_instance [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] data = neutron.list_ports(**search_opts) [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] ret = obj(*args, **kwargs) [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1028.409214] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self.list('ports', self.ports_path, retrieve_all, [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] ret = obj(*args, **kwargs) [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] for r in self._pagination(collection, path, **params): [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] res = self.get(path, params=params) [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] ret = obj(*args, **kwargs) [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self.retry_request("GET", action, body=body, [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] ret = obj(*args, **kwargs) [ 1028.409583] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1028.409938] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] return self.do_request(method, action, body=body, [ 1028.409938] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1028.409938] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] ret = obj(*args, **kwargs) [ 1028.409938] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1028.409938] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] self._handle_fault_response(status_code, replybody, resp) [ 1028.409938] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1028.409938] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] raise exception.Unauthorized() [ 1028.409938] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] nova.exception.Unauthorized: Not authorized. [ 1028.409938] env[60400]: ERROR nova.compute.manager [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] [ 1028.429476] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "63151ec9-f383-46cc-ac57-c3f7f1569410" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 318.589s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.439796] env[60400]: DEBUG nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Starting instance... 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1028.495433] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1028.495684] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1028.497459] env[60400]: INFO nova.compute.claims [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1028.502350] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1028.502350] env[60400]: ERROR nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1028.502350] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Traceback (most recent call last): [ 1028.502350] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1028.502350] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1028.502350] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1028.502350] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] result = getattr(controller, method)(*args, **kwargs) [ 1028.502350] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1028.502350] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self._get(image_id) [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] resp, body = self.http_client.get(url, headers=header) [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self.request(url, 'GET', **kwargs) [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self._handle_response(resp) [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise exc.from_response(resp, resp.content) [ 1028.502584] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] During handling of the above exception, another exception occurred: [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Traceback (most recent call last): [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] yield resources [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self.driver.spawn(context, instance, image_meta, [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self._fetch_image_if_missing(context, vi) [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1028.502856] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] image_fetch(context, vi, tmp_image_ds_loc) [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] images.fetch_image( [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] metadata = IMAGE_API.get(context, image_ref) [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return session.show(context, image_id, [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] _reraise_translated_image_exception(image_id) [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise new_exc.with_traceback(exc_trace) [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1028.503209] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] result = getattr(controller, method)(*args, **kwargs) [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self._get(image_id) [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] resp, body = self.http_client.get(url, headers=header) [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self.request(url, 'GET', **kwargs) [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self._handle_response(resp) [ 1028.503491] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1028.503757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise exc.from_response(resp, resp.content) [ 1028.503757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1028.503757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1028.503757] env[60400]: INFO nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Terminating instance [ 1028.503757] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1028.503757] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1028.504314] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1028.504493] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1028.504870] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5344607c-9350-4d9a-9d5a-2081654f5c6b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.507567] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23e73e65-9146-4f60-bd3d-ea0fa4b84b20 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.514921] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1028.515116] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-437be8b4-d39d-40a3-8011-32dba38e501d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.517632] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1028.518040] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Folder [datastore1] 
devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1028.518732] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ea0a372e-5397-4672-a518-12ba5000cc79 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.523556] env[60400]: DEBUG oslo_vmware.api [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Waiting for the task: (returnval){ [ 1028.523556] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52ed98a8-29f5-556a-9ada-d694b469a149" [ 1028.523556] env[60400]: _type = "Task" [ 1028.523556] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1028.530893] env[60400]: DEBUG oslo_vmware.api [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52ed98a8-29f5-556a-9ada-d694b469a149, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1028.575144] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1028.575320] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1028.575610] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Deleting the datastore file [datastore1] f202a181-b5ea-4b06-91ad-86356b51e088 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1028.575894] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1e9da1e8-5b1f-4a4f-a5c9-c56236d4e6c6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.582486] env[60400]: DEBUG oslo_vmware.api [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Waiting for the task: (returnval){ [ 1028.582486] env[60400]: value = "task-449850" [ 1028.582486] env[60400]: _type = "Task" [ 1028.582486] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1028.590208] env[60400]: DEBUG oslo_vmware.api [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Task: {'id': task-449850, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1028.656936] env[60400]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1028.670492] env[60400]: INFO nova.compute.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Took 0.48 seconds to deallocate network for instance. [ 1028.674566] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46e050d3-a76f-48be-a0cd-fe13950a7a42 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.682289] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fa04af0-7644-436e-8134-137ff0d530fd {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.717611] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660388f8-664c-4e9d-92ad-fed44be66252 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.725540] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-352155b8-153f-439e-9d54-461c1a030d9e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.739886] env[60400]: DEBUG nova.compute.provider_tree [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1028.747540] env[60400]: DEBUG nova.scheduler.client.report [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1028.759049] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.759512] env[60400]: DEBUG nova.compute.manager [None 
req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1028.772582] env[60400]: INFO nova.scheduler.client.report [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Deleted allocations for instance e4f0342a-4169-40aa-b234-a2e2340d5b05 [ 1028.793810] env[60400]: DEBUG nova.compute.utils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1028.795350] env[60400]: DEBUG nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1028.795511] env[60400]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1028.797400] env[60400]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "e4f0342a-4169-40aa-b234-a2e2340d5b05" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 381.535s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.798679] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "e4f0342a-4169-40aa-b234-a2e2340d5b05" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 183.165s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1028.798898] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "e4f0342a-4169-40aa-b234-a2e2340d5b05-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1028.799148] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "e4f0342a-4169-40aa-b234-a2e2340d5b05-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60400) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1028.799276] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "e4f0342a-4169-40aa-b234-a2e2340d5b05-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.801311] env[60400]: INFO nova.compute.manager [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Terminating instance [ 1028.803178] env[60400]: DEBUG nova.compute.manager [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1028.803362] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1028.803600] env[60400]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-709f87e1-bc6a-435b-830f-59ceed4ec3b6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.806676] env[60400]: DEBUG nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1028.811349] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1028.815353] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1ccdc45-ecc0-4187-b935-e6d2281cc9d5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.842103] env[60400]: WARNING nova.virt.vmwareapi.vmops [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e4f0342a-4169-40aa-b234-a2e2340d5b05 could not be found.
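The WARNING above shows the destroy path tolerating a VM that has already vanished from the backend: the driver hits nova.exception.InstanceNotFound, logs it, and lets termination continue with network and allocation cleanup rather than failing. A minimal sketch of that tolerate-and-continue pattern follows; find_vm_ref and unregister_vm are hypothetical stand-ins for the driver's VM lookup and UnregisterVM call, not Nova's actual helpers.

    import logging

    LOG = logging.getLogger(__name__)

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def destroy_instance(session, instance, find_vm_ref, unregister_vm):
        # find_vm_ref/unregister_vm are hypothetical callables standing in
        # for the driver's VM lookup and VirtualMachine.UnregisterVM call.
        try:
            vm_ref = find_vm_ref(session, instance)
            unregister_vm(session, vm_ref)
        except InstanceNotFound:
            # Backend VM is already gone: warn and treat the destroy as
            # complete so the caller can still deallocate networks and
            # free the resource claim, matching the sequence logged here.
            LOG.warning('Instance %s does not exist on backend; '
                        'treating destroy as complete', instance)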
[ 1028.842305] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1028.842479] env[60400]: INFO nova.compute.manager [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1028.842709] env[60400]: DEBUG oslo.service.loopingcall [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1028.847435] env[60400]: DEBUG nova.compute.manager [-] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1028.847534] env[60400]: DEBUG nova.network.neutron [-] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1028.854569] env[60400]: DEBUG nova.policy [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d1a7310911a431db51d3733587adb20', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ff882dc15f1f43358391269a424d2893', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 1028.861590] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1028.861831] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1028.863371] env[60400]: INFO nova.compute.claims [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1028.889774] env[60400]: DEBUG nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e
tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1028.892373] env[60400]: DEBUG nova.network.neutron [-] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1028.902784] env[60400]: INFO nova.compute.manager [-] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Took 0.06 seconds to deallocate network for instance. [ 1028.918024] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 1028.918247] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 1028.918390] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 1028.918558] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 1028.918692] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 1028.918906] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:425}} [ 1028.919024] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 1028.919183] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 1028.919340] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 1028.919494] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 1028.919676] env[60400]: DEBUG nova.virt.hardware [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 1028.920705] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdc82fdc-08fd-4600-b136-202d956d04eb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.930821] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-262881af-b346-42c2-8220-05e7efae00c0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.993704] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7d935374-e183-4a03-b795-fbe4cedc3ae5 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "e4f0342a-4169-40aa-b234-a2e2340d5b05" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.195s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.016323] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67dee141-eb2e-41de-b926-2d1ff19fd228 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.024241] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42dc0eaa-4d19-4b51-8b11-27494d9e442a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.058660] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None
req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1029.058811] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Creating directory with path [datastore1] vmware_temp/7f7a8644-f0c7-4d15-bbfc-b3f0655768f2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1029.059426] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-75ae424f-cb7b-40b3-bdf0-227d2ede7290 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.061556] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca2a95c7-0b0f-49e2-a23e-5ff1d1f8c4c0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.068619] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1af0fd60-cd50-48c5-869c-397edb681a58 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.074420] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Created directory with path [datastore1] vmware_temp/7f7a8644-f0c7-4d15-bbfc-b3f0655768f2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1029.074606] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Fetch image to [datastore1] vmware_temp/7f7a8644-f0c7-4d15-bbfc-b3f0655768f2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1029.074767] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/7f7a8644-f0c7-4d15-bbfc-b3f0655768f2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1029.082652] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b281d3af-590e-4dc3-a301-ed8d84bd81eb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.085238] env[60400]: DEBUG nova.compute.provider_tree [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1029.096290] env[60400]: DEBUG nova.scheduler.client.report [None 
req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1029.099475] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-754e8e9e-b15a-4b2e-b603-86b9e223cd70 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.103954] env[60400]: DEBUG oslo_vmware.api [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Task: {'id': task-449850, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084328} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1029.104616] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1029.104832] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1029.105006] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1029.105179] env[60400]: INFO nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Took 0.60 seconds to destroy the instance on the hypervisor. 
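Task task-449850 above follows the usual vSphere task lifecycle that the log's wait_for_task/_poll_task frames in oslo_vmware/api.py implement: submit DeleteDatastoreFile_Task, poll while progress is 0%, and return once the task reports completed successfully with its duration_secs. Below is a generic polling loop in that style; poll_task_state is a hypothetical callable standing in for the property-collector query, not the oslo.vmware API itself.

    import time

    def wait_for_task(poll_task_state, task_id, interval=0.5, timeout=60.0):
        # poll_task_state is a hypothetical callable returning a dict such
        # as {'state': 'running', 'progress': 0}, {'state': 'success',
        # 'duration_secs': 0.08}, or {'state': 'error', 'error': '...'},
        # mirroring the queued/running/success/error lifecycle of vSphere
        # tasks seen in the surrounding log lines.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = poll_task_state(task_id)
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                raise RuntimeError('Task %s failed: %s'
                                   % (task_id, info.get('error')))
            # e.g. "progress is 0%": task still running, poll again.
            time.sleep(interval)
        raise TimeoutError('Task %s did not complete within %ss'
                           % (task_id, timeout))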
[ 1029.107718] env[60400]: DEBUG nova.compute.claims [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1029.107879] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1029.113405] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.113851] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1029.117233] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d607a080-673d-4a6d-8b62-0f9c451784c9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.121630] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.014s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1029.153859] env[60400]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Successfully created port: 9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1029.155926] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.034s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.156775] env[60400]: DEBUG nova.compute.utils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Instance f202a181-b5ea-4b06-91ad-86356b51e088 could not be found. 
{{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1029.162018] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-286cd137-1baa-497b-841d-a0e3e87a5fad {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.162018] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Instance disappeared during build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1029.162018] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1029.162018] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1029.162018] env[60400]: DEBUG nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1029.162202] env[60400]: DEBUG nova.network.neutron [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1029.165840] env[60400]: DEBUG nova.compute.utils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1029.168471] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Allocating IP information in the background.
{{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1029.168631] env[60400]: DEBUG nova.network.neutron [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1029.170882] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-edd1c66b-1eb3-4078-a78c-b00c2a2be1b8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.175099] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1029.192803] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1029.226152] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1029.226923] env[60400]: ERROR nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1029.226923] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Traceback (most recent call last): [ 1029.226923] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1029.226923] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1029.226923] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1029.226923] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] result = getattr(controller, method)(*args, **kwargs) [ 1029.226923] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1029.226923] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self._get(image_id) [ 1029.226923] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1029.226923] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1029.226923] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] resp, body = self.http_client.get(url, headers=header) [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self.request(url, 'GET', **kwargs) [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self._handle_response(resp) [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise exc.from_response(resp, resp.content) [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] During handling of the above exception, another exception occurred: [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1029.227209] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Traceback (most recent call last): [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] yield resources [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self.driver.spawn(context, instance, image_meta, [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self._fetch_image_if_missing(context, vi) [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] image_fetch(context, vi, tmp_image_ds_loc) [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] images.fetch_image( [ 1029.227532] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] metadata = IMAGE_API.get(context, image_ref) [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return session.show(context, image_id, [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] _reraise_translated_image_exception(image_id) [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise new_exc.with_traceback(exc_trace) [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] result = getattr(controller, method)(*args, **kwargs) [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1029.227940] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self._get(image_id) [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] resp, body = self.http_client.get(url, headers=header) [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self.request(url, 'GET', **kwargs) [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self._handle_response(resp) [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise exc.from_response(resp, resp.content) [ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1029.228294] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1029.228614] env[60400]: INFO nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Terminating instance [ 1029.229272] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1029.229479] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1029.230154] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1029.230337] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1029.232140] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a61f0707-8e99-4b91-83e9-7ba6b31dfabb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.234578] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16f9e8cd-3e01-4bc3-a32a-4f192539bce8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.238619] env[60400]: DEBUG nova.policy [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1c1392a0b6d441328b27291a96c7ad84', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6c15bcc07e0a4e4fa73b77d300814d00', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 1029.246299] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1029.247778] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with 
opID=oslo.vmware-62e66f2c-3122-40da-ad5d-cb1a3185695d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.249733] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1029.249733] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1029.250234] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-381e6c60-a16c-48ab-b5b6-f79c0a3b3c60 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.255644] env[60400]: DEBUG oslo_vmware.api [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Waiting for the task: (returnval){ [ 1029.255644] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52399c7c-47b7-4dd9-f690-518806ae18af" [ 1029.255644] env[60400]: _type = "Task" [ 1029.255644] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1029.257310] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1029.267422] env[60400]: DEBUG oslo_vmware.api [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52399c7c-47b7-4dd9-f690-518806ae18af, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1029.294956] env[60400]: DEBUG neutronclient.v2_0.client [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60400) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1029.297337] env[60400]: ERROR nova.compute.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Traceback (most recent call last): [ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] result = getattr(controller, method)(*args, **kwargs) [ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self._get(image_id) [ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1029.297337] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] resp, body = self.http_client.get(url, headers=header) [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self.request(url, 'GET', **kwargs) [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self._handle_response(resp) [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise exc.from_response(resp, resp.content) [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] During handling of the above exception, another exception occurred: [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Traceback (most recent call last): [ 1029.297655] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self.driver.spawn(context, instance, image_meta, [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self._fetch_image_if_missing(context, vi) [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] image_fetch(context, vi, tmp_image_ds_loc) [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] images.fetch_image( [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] metadata = IMAGE_API.get(context, image_ref) [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1029.297906] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return session.show(context, image_id, [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] _reraise_translated_image_exception(image_id) [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise new_exc.with_traceback(exc_trace) [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: 
f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] result = getattr(controller, method)(*args, **kwargs) [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self._get(image_id) [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1029.298774] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] resp, body = self.http_client.get(url, headers=header) [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self.request(url, 'GET', **kwargs) [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self._handle_response(resp) [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise exc.from_response(resp, resp.content) [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] During handling of the above exception, another exception occurred: [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Traceback (most recent call last): [ 1029.299191] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self._build_and_run_instance(context, instance, image, [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] with excutils.save_and_reraise_exception(): [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self.force_reraise() [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise self.value [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] with self.rt.instance_claim(context, instance, node, allocs, [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self.abort() [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1029.299607] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self.tracker.abort_instance_claim(self.context, self.instance, [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return f(*args, **kwargs) [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self._unset_instance_host_and_node(instance) [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: 
f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] instance.save() [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] updates, result = self.indirection_api.object_action( [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return cctxt.call(context, 'object_action', objinst=objinst, [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1029.299992] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] result = self.transport._send( [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self._driver.send(target, ctxt, message, [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise result [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] nova.exception_Remote.InstanceNotFound_Remote: Instance f202a181-b5ea-4b06-91ad-86356b51e088 could not be found. 
[ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Traceback (most recent call last): [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return getattr(target, method)(*args, **kwargs) [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.300386] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return fn(self, *args, **kwargs) [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] old_ref, inst_ref = db.instance_update_and_get_original( [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return f(*args, **kwargs) [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] with excutils.save_and_reraise_exception() as ectxt: [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self.force_reraise() [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.300757] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise self.value [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return f(*args, 
**kwargs) [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return f(context, *args, **kwargs) [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise exception.InstanceNotFound(instance_id=uuid) [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.301210] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] nova.exception.InstanceNotFound: Instance f202a181-b5ea-4b06-91ad-86356b51e088 could not be found. [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] During handling of the above exception, another exception occurred: [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Traceback (most recent call last): [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] ret = obj(*args, **kwargs) [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] exception_handler_v20(status_code, error_body) [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise client_exc(message=error_message, [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1029.301627] 
env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Neutron server returns request_ids: ['req-dca1c9fd-f2b8-4954-9373-b343e1e93273'] [ 1029.301627] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] During handling of the above exception, another exception occurred: [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Traceback (most recent call last): [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self._deallocate_network(context, instance, requested_networks) [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self.network_api.deallocate_for_instance( [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/network/neutron.py", line 1798, in deallocate_for_instance [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] data = neutron.list_ports(**search_opts) [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] ret = obj(*args, **kwargs) [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1029.302081] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self.list('ports', self.ports_path, retrieve_all, [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] ret = obj(*args, **kwargs) [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] for r in self._pagination(collection, path, **params): [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] res = self.get(path, params=params) [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] ret = obj(*args, **kwargs) [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self.retry_request("GET", action, body=body, [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] ret = obj(*args, **kwargs) [ 1029.302679] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1029.303024] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] return self.do_request(method, action, body=body, [ 1029.303024] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1029.303024] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] ret = obj(*args, **kwargs) [ 1029.303024] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1029.303024] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] self._handle_fault_response(status_code, replybody, resp) [ 1029.303024] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1029.303024] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] raise exception.Unauthorized() [ 1029.303024] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] nova.exception.Unauthorized: Not authorized. 
[ 1029.303024] env[60400]: ERROR nova.compute.manager [instance: f202a181-b5ea-4b06-91ad-86356b51e088] [ 1029.313886] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 1029.314125] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 1029.314278] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 1029.314562] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 1029.314673] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 1029.314907] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 1029.314943] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 1029.315116] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 1029.315753] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 
tempest-ServersTestJSON-1437394991-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 1029.315753] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 1029.315753] env[60400]: DEBUG nova.virt.hardware [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 1029.316618] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12ecbe20-5abb-40f7-9304-61145e11642c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.323869] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "f202a181-b5ea-4b06-91ad-86356b51e088" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 320.267s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.325020] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bf804c2-da82-418a-8fbd-7023bf5a0172 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.340146] env[60400]: DEBUG nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Starting instance... 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1029.396930] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1029.397235] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1029.398702] env[60400]: INFO nova.compute.claims [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1029.588075] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc2c74a9-b403-4a0e-a990-677939eb8ae4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.597305] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46cbff58-d036-4153-ae1d-4134c530f05c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.639415] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53a0b113-2b11-4153-b4ed-e16bd341ffec {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.648912] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb049bb3-a6ab-4812-9fc5-5f160235a52f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.663928] env[60400]: DEBUG nova.compute.provider_tree [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1029.665671] env[60400]: DEBUG nova.network.neutron [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Successfully created port: 64fa08d6-5cd6-4437-b6ca-08257e3f0696 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1029.673265] env[60400]: DEBUG nova.scheduler.client.report [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1029.693575] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.694281] env[60400]: DEBUG nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1029.732276] env[60400]: DEBUG nova.compute.utils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1029.736971] env[60400]: DEBUG nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Not allocating networking since 'none' was specified. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 1029.742117] env[60400]: DEBUG nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Start building block device mappings for instance. 
{{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1029.766964] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1029.767295] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Creating directory with path [datastore1] vmware_temp/68eda559-1c0b-4c8e-8e4d-8faad9154617/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1029.767602] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fb61180b-f5cd-4281-be01-75e207890dc4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.790015] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Created directory with path [datastore1] vmware_temp/68eda559-1c0b-4c8e-8e4d-8faad9154617/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1029.790015] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Fetch image to [datastore1] vmware_temp/68eda559-1c0b-4c8e-8e4d-8faad9154617/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1029.790015] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/68eda559-1c0b-4c8e-8e4d-8faad9154617/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1029.790015] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d01e5f4-9e59-49ac-8312-990bc8e71658 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.797278] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cc84e22-e6dc-4e0a-9528-55c5245ed64f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.808210] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f795a327-cb38-4af7-b321-a7e372b7c63a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.815035] env[60400]: DEBUG nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: 
c6ee7d41-5522-4019-9da9-8503ec99e2b5] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1029.850243] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41615890-a6b5-48c7-9265-c713ae870643 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.856688] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b3cfc4b1-f806-4903-b1fb-3fa31c95fdc4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.868912] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 1029.869210] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 1029.869312] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 1029.869484] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 1029.869768] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 1029.869891] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 1029.870100] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Topology preferred 
VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 1029.870277] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 1029.870438] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 1029.870594] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 1029.870866] env[60400]: DEBUG nova.virt.hardware [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 1029.871925] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95030130-381d-4dc0-b865-7b77f3427a69 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.875660] env[60400]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Successfully updated port: 9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1029.878111] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1029.892129] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30785473-2381-4964-89e1-9feca789a104 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.898425] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "refresh_cache-0257c136-6f30-43ae-8f8d-e8f23d8328ef" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.898425] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquired lock 
"refresh_cache-0257c136-6f30-43ae-8f8d-e8f23d8328ef" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1029.898593] env[60400]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 1029.910263] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Instance VIF info [] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1029.915743] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Creating folder: Project (2549d966f11047368e896d5354721163). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1029.919061] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c732e21d-1bba-49bd-8071-31e7ac9f81c4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.928764] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Created folder: Project (2549d966f11047368e896d5354721163) in parent group-v119075. [ 1029.928849] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Creating folder: Instances. Parent ref: group-v119135. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1029.929014] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-defa70bd-5a28-4836-b033-e1535b2aba6e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.938556] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Created folder: Instances in parent group-v119135. [ 1029.938778] env[60400]: DEBUG oslo.service.loopingcall [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1029.938951] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1029.939152] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-739c01fe-20b8-4508-9e8e-9b042aabd671 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.952083] env[60400]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 1029.958802] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1029.958802] env[60400]: value = "task-449854" [ 1029.958802] env[60400]: _type = "Task" [ 1029.958802] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1029.966210] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449854, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1030.018083] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1030.018877] env[60400]: ERROR nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1030.018877] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Traceback (most recent call last): [ 1030.018877] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1030.018877] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1030.018877] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1030.018877] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] result = getattr(controller, method)(*args, **kwargs) [ 1030.018877] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1030.018877] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self._get(image_id) [ 1030.018877] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1030.018877] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1030.018877] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] resp, body = self.http_client.get(url, headers=header) [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self.request(url, 'GET', **kwargs) [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self._handle_response(resp) [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise exc.from_response(resp, resp.content) [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] During handling of the above exception, another exception occurred: [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1030.019188] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Traceback (most recent call last): [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] yield resources [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self.driver.spawn(context, instance, image_meta, [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self._fetch_image_if_missing(context, vi) [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] image_fetch(context, vi, tmp_image_ds_loc) [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] images.fetch_image( [ 1030.019498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] metadata = IMAGE_API.get(context, image_ref) [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return session.show(context, image_id, [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] _reraise_translated_image_exception(image_id) [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise new_exc.with_traceback(exc_trace) [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] result = getattr(controller, method)(*args, **kwargs) [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1030.019839] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self._get(image_id) [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] resp, body = self.http_client.get(url, headers=header) [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self.request(url, 'GET', **kwargs) [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self._handle_response(resp) [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise exc.from_response(resp, resp.content) [ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1030.020224] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1030.020546] env[60400]: INFO nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Terminating instance [ 1030.021040] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1030.021250] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1030.021874] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1030.022078] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1030.022313] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9d3c1a83-3052-4981-8035-994ae500656a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.025309] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c0a2548-d094-499b-a1d1-f569b021848a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.034794] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1030.035897] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3bb9c2d6-4928-44b6-89bd-8678ffbc5ef3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.037383] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1030.037586] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 
tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1030.038388] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-18e3eba1-3467-421f-86dc-36b3164a2de2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.044335] env[60400]: DEBUG oslo_vmware.api [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Waiting for the task: (returnval){ [ 1030.044335] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]522febd6-2e4d-ed93-4c3d-607760149faa" [ 1030.044335] env[60400]: _type = "Task" [ 1030.044335] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1030.053199] env[60400]: DEBUG oslo_vmware.api [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]522febd6-2e4d-ed93-4c3d-607760149faa, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1030.148701] env[60400]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Updating instance_info_cache with network_info: [{"id": "9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8", "address": "fa:16:3e:60:96:9c", "network": {"id": "45737e3b-a83e-4653-af61-1458daa56b18", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1907417589-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ff882dc15f1f43358391269a424d2893", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46e1fc20-2067-4e1a-9812-702772a2c82c", "external-id": "nsx-vlan-transportzone-210", "segmentation_id": 210, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9b4ee6bc-27", "ovs_interfaceid": "9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1030.160350] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Releasing lock "refresh_cache-0257c136-6f30-43ae-8f8d-e8f23d8328ef" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1030.160658] env[60400]: DEBUG 
nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Instance network_info: |[{"id": "9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8", "address": "fa:16:3e:60:96:9c", "network": {"id": "45737e3b-a83e-4653-af61-1458daa56b18", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1907417589-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ff882dc15f1f43358391269a424d2893", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46e1fc20-2067-4e1a-9812-702772a2c82c", "external-id": "nsx-vlan-transportzone-210", "segmentation_id": 210, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9b4ee6bc-27", "ovs_interfaceid": "9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1030.161125] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:60:96:9c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '46e1fc20-2067-4e1a-9812-702772a2c82c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1030.169710] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Creating folder: Project (ff882dc15f1f43358391269a424d2893). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1030.170314] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-92353f6f-68ee-4b56-93b0-d25c8752f7df {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.181407] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Created folder: Project (ff882dc15f1f43358391269a424d2893) in parent group-v119075. [ 1030.181620] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Creating folder: Instances. Parent ref: group-v119138. 
{{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1030.181805] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b369452f-2976-4626-b28c-4986ee390491 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.192246] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Created folder: Instances in parent group-v119138. [ 1030.192246] env[60400]: DEBUG oslo.service.loopingcall [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1030.192246] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1030.192246] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2b386aa0-96f4-437e-9ed7-0d21c53966a0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.211182] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1030.211182] env[60400]: value = "task-449858" [ 1030.211182] env[60400]: _type = "Task" [ 1030.211182] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1030.218555] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449858, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1030.445459] env[60400]: DEBUG nova.compute.manager [req-6214f9cf-99a1-4d42-82ab-1ada499f53c1 req-3061fd46-62b2-4f08-8b21-3c6fb0b0f29a service nova] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Received event network-vif-plugged-64fa08d6-5cd6-4437-b6ca-08257e3f0696 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1030.445710] env[60400]: DEBUG oslo_concurrency.lockutils [req-6214f9cf-99a1-4d42-82ab-1ada499f53c1 req-3061fd46-62b2-4f08-8b21-3c6fb0b0f29a service nova] Acquiring lock "b5ad6145-8bf0-4aed-951b-eb11dd87ed7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1030.445947] env[60400]: DEBUG oslo_concurrency.lockutils [req-6214f9cf-99a1-4d42-82ab-1ada499f53c1 req-3061fd46-62b2-4f08-8b21-3c6fb0b0f29a service nova] Lock "b5ad6145-8bf0-4aed-951b-eb11dd87ed7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1030.446190] env[60400]: DEBUG oslo_concurrency.lockutils [req-6214f9cf-99a1-4d42-82ab-1ada499f53c1 req-3061fd46-62b2-4f08-8b21-3c6fb0b0f29a service nova] Lock "b5ad6145-8bf0-4aed-951b-eb11dd87ed7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1030.446290] env[60400]: DEBUG nova.compute.manager [req-6214f9cf-99a1-4d42-82ab-1ada499f53c1 req-3061fd46-62b2-4f08-8b21-3c6fb0b0f29a service nova] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] No waiting events found dispatching network-vif-plugged-64fa08d6-5cd6-4437-b6ca-08257e3f0696 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1030.446430] env[60400]: WARNING nova.compute.manager [req-6214f9cf-99a1-4d42-82ab-1ada499f53c1 req-3061fd46-62b2-4f08-8b21-3c6fb0b0f29a service nova] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Received unexpected event network-vif-plugged-64fa08d6-5cd6-4437-b6ca-08257e3f0696 for instance with vm_state building and task_state spawning. [ 1030.469094] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449854, 'name': CreateVM_Task, 'duration_secs': 0.31656} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1030.469206] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1030.469516] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1030.469710] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1030.470044] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1030.470282] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f1f9d69-947e-4b68-b562-bca9c42e1446 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.474678] env[60400]: DEBUG oslo_vmware.api [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for the task: (returnval){ [ 1030.474678] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]520923c0-2a7d-1381-fedb-2f1877ad59eb" [ 1030.474678] env[60400]: _type = "Task" [ 1030.474678] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1030.482475] env[60400]: DEBUG oslo_vmware.api [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]520923c0-2a7d-1381-fedb-2f1877ad59eb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1030.492178] env[60400]: DEBUG nova.network.neutron [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Successfully updated port: 64fa08d6-5cd6-4437-b6ca-08257e3f0696 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1030.500481] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "refresh_cache-b5ad6145-8bf0-4aed-951b-eb11dd87ed7d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1030.500638] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquired lock "refresh_cache-b5ad6145-8bf0-4aed-951b-eb11dd87ed7d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1030.500790] env[60400]: DEBUG nova.network.neutron [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 1030.540896] env[60400]: DEBUG nova.network.neutron [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Instance cache missing network info. 
{{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 1030.555752] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1030.556009] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Creating directory with path [datastore1] vmware_temp/b20b1e11-ed4f-4b25-bdc3-15fe399d6649/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1030.556236] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b7fa5d01-bdd0-45bf-9fde-a92395fe7fba {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.568932] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Created directory with path [datastore1] vmware_temp/b20b1e11-ed4f-4b25-bdc3-15fe399d6649/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1030.569171] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Fetch image to [datastore1] vmware_temp/b20b1e11-ed4f-4b25-bdc3-15fe399d6649/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1030.569342] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/b20b1e11-ed4f-4b25-bdc3-15fe399d6649/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1030.570150] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d592363d-29d0-4b0b-9a5d-7817f32d7ad8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.578307] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49d1bb92-609b-465e-857c-347f13f33265 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.591725] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22595929-c841-4bd0-a88b-821e99d19624 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.630846] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-71ca67b3-1f00-452c-a256-8ddd68b08ec4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.640831] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-363bc614-f0f6-4a81-907b-8a8aa1af5a7a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.665716] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1030.703306] env[60400]: DEBUG nova.compute.manager [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Received event network-vif-plugged-9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1030.703306] env[60400]: DEBUG oslo_concurrency.lockutils [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] Acquiring lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1030.703306] env[60400]: DEBUG oslo_concurrency.lockutils [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] Lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1030.703306] env[60400]: DEBUG oslo_concurrency.lockutils [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] Lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1030.703527] env[60400]: DEBUG nova.compute.manager [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] No waiting events found dispatching network-vif-plugged-9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1030.703527] env[60400]: WARNING nova.compute.manager [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Received unexpected event network-vif-plugged-9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8 for instance with vm_state building and task_state spawning. 
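[editor's note] The "Received unexpected event network-vif-plugged-..." warnings above come from Nova's external-event dispatch: Neutron reports the VIF plug, the compute manager takes the per-instance "<uuid>-events" lock, and pops a matching waiter if one was registered; when the instance is still building and nothing is waiting yet, it logs the warning instead. A minimal pop-or-warn sketch under that reading, using a plain dict of waiters; the names and structure here are illustrative, not Nova's actual internals:

    import threading

    class InstanceEvents:
        """Toy version of the waiter registry behind pop_instance_event."""

        def __init__(self):
            self._lock = threading.Lock()   # stands in for the "-events" lock
            self._waiters = {}              # (instance_uuid, event_name) -> Event

        def prepare(self, instance_uuid, event_name):
            # Called by code that intends to block until an external event.
            ev = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = ev
            return ev

        def pop_instance_event(self, instance_uuid, event_name):
            with self._lock:
                return self._waiters.pop((instance_uuid, event_name), None)

    def external_instance_event(events, instance_uuid, event_name):
        waiter = events.pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            # Matches "No waiting events found dispatching ..." followed by
            # "Received unexpected event ... for instance with vm_state building".
            print("WARNING: unexpected event %s for %s"
                  % (event_name, instance_uuid))
        else:
            waiter.set()    # unblock the thread waiting on this event

    events = InstanceEvents()
    external_instance_event(
        events, "b5ad6145-8bf0-4aed-951b-eb11dd87ed7d",
        "network-vif-plugged-64fa08d6-5cd6-4437-b6ca-08257e3f0696")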
[ 1030.703527] env[60400]: DEBUG nova.compute.manager [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Received event network-changed-9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1030.703942] env[60400]: DEBUG nova.compute.manager [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Refreshing instance network info cache due to event network-changed-9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1030.704323] env[60400]: DEBUG oslo_concurrency.lockutils [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] Acquiring lock "refresh_cache-0257c136-6f30-43ae-8f8d-e8f23d8328ef" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1030.704612] env[60400]: DEBUG oslo_concurrency.lockutils [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] Acquired lock "refresh_cache-0257c136-6f30-43ae-8f8d-e8f23d8328ef" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1030.705030] env[60400]: DEBUG nova.network.neutron [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Refreshing network info cache for port 9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 1030.722263] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449858, 'name': CreateVM_Task} progress is 25%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1030.730727] env[60400]: DEBUG nova.network.neutron [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Updating instance_info_cache with network_info: [{"id": "64fa08d6-5cd6-4437-b6ca-08257e3f0696", "address": "fa:16:3e:ee:c5:89", "network": {"id": "a080c599-4112-4b95-a2aa-a105bcae80e4", "bridge": "br-int", "label": "tempest-ServersTestJSON-1899793423-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6c15bcc07e0a4e4fa73b77d300814d00", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a874c214-8cdf-4a41-a718-84262b2a28d8", "external-id": "cl2-zone-726", "segmentation_id": 726, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64fa08d6-5c", "ovs_interfaceid": "64fa08d6-5cd6-4437-b6ca-08257e3f0696", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1030.752115] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Releasing lock "refresh_cache-b5ad6145-8bf0-4aed-951b-eb11dd87ed7d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1030.752321] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Instance network_info: |[{"id": "64fa08d6-5cd6-4437-b6ca-08257e3f0696", "address": "fa:16:3e:ee:c5:89", "network": {"id": "a080c599-4112-4b95-a2aa-a105bcae80e4", "bridge": "br-int", "label": "tempest-ServersTestJSON-1899793423-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6c15bcc07e0a4e4fa73b77d300814d00", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a874c214-8cdf-4a41-a718-84262b2a28d8", "external-id": "cl2-zone-726", "segmentation_id": 726, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64fa08d6-5c", "ovs_interfaceid": "64fa08d6-5cd6-4437-b6ca-08257e3f0696", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1030.752677] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None 
req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ee:c5:89', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a874c214-8cdf-4a41-a718-84262b2a28d8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '64fa08d6-5cd6-4437-b6ca-08257e3f0696', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1030.760142] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Creating folder: Project (6c15bcc07e0a4e4fa73b77d300814d00). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1030.760639] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1e9d3e02-14d3-431b-96f8-3d778d153faf {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.771930] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Created folder: Project (6c15bcc07e0a4e4fa73b77d300814d00) in parent group-v119075. [ 1030.772133] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Creating folder: Instances. Parent ref: group-v119141. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1030.772334] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eb6a9dfc-50ac-41ce-9657-a0f41964f278 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.781720] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Created folder: Instances in parent group-v119141. [ 1030.781931] env[60400]: DEBUG oslo.service.loopingcall [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1030.782107] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1030.782297] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b1af5db2-425c-480f-b988-758c3947e8cd {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.801270] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1030.802029] env[60400]: ERROR nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. [ 1030.802029] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Traceback (most recent call last): [ 1030.802029] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1030.802029] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1030.802029] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1030.802029] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] result = getattr(controller, method)(*args, **kwargs) [ 1030.802029] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1030.802029] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self._get(image_id) [ 1030.802029] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1030.802029] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1030.802029] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] resp, body = self.http_client.get(url, headers=header) [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self.request(url, 'GET', 
**kwargs) [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self._handle_response(resp) [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise exc.from_response(resp, resp.content) [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] During handling of the above exception, another exception occurred: [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1030.802339] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Traceback (most recent call last): [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] yield resources [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self.driver.spawn(context, instance, image_meta, [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self._fetch_image_if_missing(context, vi) [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] image_fetch(context, vi, tmp_image_ds_loc) [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] 
images.fetch_image( [ 1030.802653] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] metadata = IMAGE_API.get(context, image_ref) [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return session.show(context, image_id, [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] _reraise_translated_image_exception(image_id) [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise new_exc.with_traceback(exc_trace) [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] result = getattr(controller, method)(*args, **kwargs) [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1030.803027] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self._get(image_id) [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] resp, body = self.http_client.get(url, headers=header) [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self.request(url, 'GET', **kwargs) [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1030.803364] 
env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self._handle_response(resp) [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise exc.from_response(resp, resp.content) [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. [ 1030.803364] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1030.803676] env[60400]: INFO nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Terminating instance [ 1030.804569] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1030.804771] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1030.805407] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1030.805618] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1030.805850] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c09bcddf-f13c-40b7-abef-2827ffc8961a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.808550] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90084586-47bb-4d37-a5fc-207b6a670d28 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.815223] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1030.815223] env[60400]: value = "task-449861" [ 1030.815223] env[60400]: _type = "Task" [ 1030.815223] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1030.826057] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449861, 'name': CreateVM_Task} progress is 6%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1030.828086] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1030.829016] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1d673f15-e05d-4dd5-b2e5-2650aa645864 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.830448] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1030.830658] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1030.831336] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-198ea21a-d889-4dfe-ad59-b65d3a976ae5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.837286] env[60400]: DEBUG oslo_vmware.api [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Waiting for the task: (returnval){ [ 1030.837286] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52164fb4-cb96-dcc7-8bed-e15b9c28afbe" [ 1030.837286] env[60400]: _type = "Task" [ 1030.837286] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1030.844928] env[60400]: DEBUG oslo_vmware.api [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52164fb4-cb96-dcc7-8bed-e15b9c28afbe, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1030.985096] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1030.985345] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1030.985546] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1031.038766] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1031.038956] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1031.039298] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Deleting the datastore file [datastore1] 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1031.040227] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-160498bc-eb92-4c03-85c1-b2c287ddba1a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.043010] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1031.043269] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1031.043474] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 
tempest-ServersTestJSON-895806724-project-member] Deleting the datastore file [datastore1] c5b391a9-7969-4119-9bc6-b0e1fe7a9713 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1031.046259] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3f535aa5-2814-490f-87dd-d46d857205f2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.054650] env[60400]: DEBUG oslo_vmware.api [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Waiting for the task: (returnval){ [ 1031.054650] env[60400]: value = "task-449864" [ 1031.054650] env[60400]: _type = "Task" [ 1031.054650] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.055978] env[60400]: DEBUG oslo_vmware.api [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Waiting for the task: (returnval){ [ 1031.055978] env[60400]: value = "task-449863" [ 1031.055978] env[60400]: _type = "Task" [ 1031.055978] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.063449] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1031.063736] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1031.063981] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Deleting the datastore file [datastore1] 7476fb96-5247-472c-ab92-ef7e5916cb00 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1031.064609] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e7ee3b06-ccf2-4806-b681-c2954568c587 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.070036] env[60400]: DEBUG oslo_vmware.api [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Task: {'id': task-449864, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1031.074625] env[60400]: DEBUG oslo_vmware.api [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Task: {'id': task-449863, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1031.075850] env[60400]: DEBUG oslo_vmware.api [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Waiting for the task: (returnval){ [ 1031.075850] env[60400]: value = "task-449865" [ 1031.075850] env[60400]: _type = "Task" [ 1031.075850] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.083851] env[60400]: DEBUG oslo_vmware.api [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Task: {'id': task-449865, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1031.202523] env[60400]: DEBUG nova.network.neutron [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Updated VIF entry in instance network info cache for port 9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 1031.202927] env[60400]: DEBUG nova.network.neutron [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Updating instance_info_cache with network_info: [{"id": "9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8", "address": "fa:16:3e:60:96:9c", "network": {"id": "45737e3b-a83e-4653-af61-1458daa56b18", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1907417589-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ff882dc15f1f43358391269a424d2893", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46e1fc20-2067-4e1a-9812-702772a2c82c", "external-id": "nsx-vlan-transportzone-210", "segmentation_id": 210, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9b4ee6bc-27", "ovs_interfaceid": "9b4ee6bc-272e-4e32-bf39-aa0182ccf6c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1031.219032] env[60400]: DEBUG oslo_concurrency.lockutils [req-4b857b4f-e0ec-4c20-b67a-dba8a0049760 req-18501b71-12e4-4ea3-9f92-58dbc35ebff9 service nova] Releasing lock "refresh_cache-0257c136-6f30-43ae-8f8d-e8f23d8328ef" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1031.223198] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449858, 'name': CreateVM_Task} progress is 99%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1031.328894] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449861, 'name': CreateVM_Task, 'duration_secs': 0.442818} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1031.329065] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1031.329645] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1031.329871] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1031.330168] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1031.330433] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9f5d47ec-1bbb-4edb-b5ac-3ad277d040eb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.337318] env[60400]: DEBUG oslo_vmware.api [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Waiting for the task: (returnval){ [ 1031.337318] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]522db0c0-1d66-b715-8ec5-7ff6b95d35c4" [ 1031.337318] env[60400]: _type = "Task" [ 1031.337318] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.346338] env[60400]: DEBUG oslo_vmware.api [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]522db0c0-1d66-b715-8ec5-7ff6b95d35c4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1031.349654] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1031.349877] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Creating directory with path [datastore1] vmware_temp/f6135ee0-1e1b-4e1d-84bd-a48c45e9acf1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1031.350094] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e268769b-7c11-4e68-9ae4-a20a817bacb3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.365613] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Created directory with path [datastore1] vmware_temp/f6135ee0-1e1b-4e1d-84bd-a48c45e9acf1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1031.365799] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Fetch image to [datastore1] vmware_temp/f6135ee0-1e1b-4e1d-84bd-a48c45e9acf1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1031.365964] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/f6135ee0-1e1b-4e1d-84bd-a48c45e9acf1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1031.366696] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dae0c7b9-adfd-48d7-8264-bb0772febf3b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.374728] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d082733-cf5f-4b5d-b15f-45a417471282 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.383688] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7295b18e-3541-4f57-ac5f-2391275a6577 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.414903] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-52b0873e-8d32-4ddf-9ecd-dfbabffa029b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.420442] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-961003e1-90b3-4658-9b4a-233c7f2f62be {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.439114] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1031.541399] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1031.542156] env[60400]: ERROR nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. [ 1031.542156] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Traceback (most recent call last): [ 1031.542156] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1031.542156] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1031.542156] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1031.542156] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] result = getattr(controller, method)(*args, **kwargs) [ 1031.542156] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1031.542156] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self._get(image_id) [ 1031.542156] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1031.542156] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1031.542156] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] resp, body = self.http_client.get(url, headers=header) [ 1031.542485] env[60400]: ERROR 
nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self.request(url, 'GET', **kwargs) [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self._handle_response(resp) [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise exc.from_response(resp, resp.content) [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] During handling of the above exception, another exception occurred: [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1031.542485] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Traceback (most recent call last): [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] yield resources [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self.driver.spawn(context, instance, image_meta, [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self._fetch_image_if_missing(context, vi) [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] image_fetch(context, vi, tmp_image_ds_loc) [ 
1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] images.fetch_image( [ 1031.542800] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] metadata = IMAGE_API.get(context, image_ref) [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return session.show(context, image_id, [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] _reraise_translated_image_exception(image_id) [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise new_exc.with_traceback(exc_trace) [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] result = getattr(controller, method)(*args, **kwargs) [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1031.543155] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self._get(image_id) [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] resp, body = self.http_client.get(url, headers=header) [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 
35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self.request(url, 'GET', **kwargs) [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self._handle_response(resp) [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise exc.from_response(resp, resp.content) [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. [ 1031.543542] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1031.543806] env[60400]: INFO nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Terminating instance [ 1031.544212] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1031.544212] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1031.545082] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Start destroying the instance on the hypervisor. 
{{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1031.545268] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1031.545485] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b724f285-5ba1-4552-b76c-bc446deb3d63 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.548078] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56f3ebd1-ab8c-43fd-be4c-1c0d4f9f814a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.555124] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1031.555363] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1c708ca3-d35a-4131-b17f-a42f279f9683 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.557728] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1031.557912] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1031.561480] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7cc084ed-7655-434d-ab53-4326c9fcbdd1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.569237] env[60400]: DEBUG oslo_vmware.api [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Waiting for the task: (returnval){ [ 1031.569237] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]521ac239-5ce4-00c8-f1ed-03e34b6eeff2" [ 1031.569237] env[60400]: _type = "Task" [ 1031.569237] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.574814] env[60400]: DEBUG oslo_vmware.api [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Task: {'id': task-449863, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.090595} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1031.575037] env[60400]: DEBUG oslo_vmware.api [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Task: {'id': task-449864, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.087642} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1031.578206] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1031.578353] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1031.578511] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1031.578682] env[60400]: INFO nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Took 1.56 seconds to destroy the instance on the hypervisor. [ 1031.580289] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1031.580405] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1031.580566] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1031.580757] env[60400]: INFO nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Took 2.35 seconds to destroy the instance on the hypervisor. 
[ 1031.582655] env[60400]: DEBUG nova.compute.claims [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1031.582781] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1031.582982] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1031.597170] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1031.597449] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Creating directory with path [datastore1] vmware_temp/f7625c3d-ec25-4801-87b4-c4b652caf32b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1031.597740] env[60400]: DEBUG oslo_vmware.api [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Task: {'id': task-449865, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.114106} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1031.598185] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3172ec6d-03e7-45f6-921e-bc0ce73c910a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.600117] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1031.600317] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1031.600512] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1031.600745] env[60400]: INFO nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Took 0.80 seconds to destroy the instance on the hypervisor. [ 1031.602781] env[60400]: DEBUG nova.compute.claims [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1031.602976] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1031.608280] env[60400]: DEBUG nova.compute.claims [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1031.608531] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1031.614231] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Created directory with path [datastore1] 
vmware_temp/f7625c3d-ec25-4801-87b4-c4b652caf32b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1031.614489] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Fetch image to [datastore1] vmware_temp/f7625c3d-ec25-4801-87b4-c4b652caf32b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1031.614673] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/f7625c3d-ec25-4801-87b4-c4b652caf32b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1031.616125] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00935a5e-f158-47d6-bb6e-c1190d4476be {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.618979] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.036s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1031.620052] env[60400]: DEBUG nova.compute.utils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Instance 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3 could not be found. 
{{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1031.622177] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.019s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1031.624595] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1031.624769] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1031.624932] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Deleting the datastore file [datastore1] 35630c7b-fdf4-4d6d-8e5a-0045f1387f93 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1031.625985] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6a09d8a5-f35c-4c96-b750-f9800b352681 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.630867] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20e36961-5da3-4bb7-a409-6239e6e7870f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.635120] env[60400]: DEBUG oslo_vmware.api [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Waiting for the task: (returnval){ [ 1031.635120] env[60400]: value = "task-449867" [ 1031.635120] env[60400]: _type = "Task" [ 1031.635120] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.635529] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Instance disappeared during build. 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1031.635661] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1031.635812] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1031.635967] env[60400]: DEBUG nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1031.636133] env[60400]: DEBUG nova.network.neutron [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1031.648395] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e22b8b34-f15e-487b-8267-eead361688c6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.655637] env[60400]: DEBUG oslo_vmware.api [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Task: {'id': task-449867, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1031.655992] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.034s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1031.656678] env[60400]: DEBUG nova.compute.utils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Instance 7476fb96-5247-472c-ab92-ef7e5916cb00 could not be found. 
{{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1031.658009] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.050s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1031.687939] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Instance disappeared during build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1031.687939] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1031.687939] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1031.687939] env[60400]: DEBUG nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1031.688131] env[60400]: DEBUG nova.network.neutron [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1031.692041] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.032s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1031.692041] env[60400]: DEBUG nova.compute.utils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Instance c5b391a9-7969-4119-9bc6-b0e1fe7a9713 could not be found. 
{{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1031.693744] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-171bc9f4-fa7a-41d2-8501-d43400c30207 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.696946] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Instance disappeared during build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1031.697127] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1031.697284] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1031.697445] env[60400]: DEBUG nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1031.697597] env[60400]: DEBUG nova.network.neutron [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1031.702322] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-54aaaec6-34ae-47dc-95ba-0e6c7092d1f6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.721072] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449858, 'name': CreateVM_Task, 'duration_secs': 1.04311} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1031.722453] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1031.722453] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1031.726438] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1031.727147] env[60400]: DEBUG neutronclient.v2_0.client [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60400) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1031.728165] env[60400]: ERROR nova.compute.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1031.728165] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Traceback (most recent call last): [ 1031.728165] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1031.728165] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1031.728165] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1031.728165] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] result = getattr(controller, method)(*args, **kwargs) [ 1031.728165] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1031.728165] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self._get(image_id) [ 1031.728165] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1031.728165] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1031.728165] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] resp, body = self.http_client.get(url, headers=header) [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self.request(url, 'GET', **kwargs) [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self._handle_response(resp) [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise exc.from_response(resp, resp.content) [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] During handling of the above exception, another exception occurred: [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.728660] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Traceback (most recent call last): [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self.driver.spawn(context, instance, image_meta, [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self._fetch_image_if_missing(context, vi) [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] image_fetch(context, vi, tmp_image_ds_loc) [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] images.fetch_image( [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] metadata = IMAGE_API.get(context, image_ref) [ 1031.729510] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return session.show(context, image_id, [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] _reraise_translated_image_exception(image_id) [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise new_exc.with_traceback(exc_trace) [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 
7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] result = getattr(controller, method)(*args, **kwargs) [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self._get(image_id) [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1031.729819] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] resp, body = self.http_client.get(url, headers=header) [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self.request(url, 'GET', **kwargs) [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self._handle_response(resp) [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise exc.from_response(resp, resp.content) [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
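The ImageNotAuthorized that closes this segment is not raised by glanceclient itself: the client raised HTTPUnauthorized (the 401 above), and nova/image/glance.py re-raised it as a Nova-level exception on the same traceback, which is the "raise new_exc.with_traceback(exc_trace)" frame visible above. A minimal, self-contained sketch of that translate-and-reraise pattern, using stand-in exception classes rather than the real Nova and glanceclient ones:

    import sys

    class HTTPUnauthorized(Exception):
        """Stand-in for glanceclient.exc.HTTPUnauthorized."""

    class ImageNotAuthorized(Exception):
        """Stand-in for nova.exception.ImageNotAuthorized."""
        def __init__(self, image_id):
            super().__init__("Not authorized for image %s." % image_id)

    def _reraise_translated_image_exception(image_id):
        # Swap the client exception for a Nova-level one, but keep the
        # original traceback so the log still shows the full client chain.
        _exc_type, exc_value, exc_trace = sys.exc_info()
        if isinstance(exc_value, HTTPUnauthorized):
            raise ImageNotAuthorized(image_id).with_traceback(exc_trace)
        raise exc_value

    def show(image_id):
        try:
            raise HTTPUnauthorized("HTTP 401 Unauthorized")
        except HTTPUnauthorized:
            _reraise_translated_image_exception(image_id)

Calling show("f5dfd970-7a56-4489-873c-2c3b6fbd9fe9") raises the stand-in ImageNotAuthorized whose traceback still points at the frame that produced the 401, matching the shape of the chained report above.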
[ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] During handling of the above exception, another exception occurred: [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.730349] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Traceback (most recent call last): [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self._build_and_run_instance(context, instance, image, [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] with excutils.save_and_reraise_exception(): [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self.force_reraise() [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise self.value [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] with self.rt.instance_claim(context, instance, node, allocs, [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self.abort() [ 1031.732453] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self.tracker.abort_instance_claim(self.context, self.instance, [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return f(*args, **kwargs) [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self._unset_instance_host_and_node(instance) [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 
7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] instance.save() [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] updates, result = self.indirection_api.object_action( [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return cctxt.call(context, 'object_action', objinst=objinst, [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1031.733766] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] result = self.transport._send( [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self._driver.send(target, ctxt, message, [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise result [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] nova.exception_Remote.InstanceNotFound_Remote: Instance 7476fb96-5247-472c-ab92-ef7e5916cb00 could not be found. 
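Note how the failure mutates at this point: the build failed with ImageNotAuthorized, _build_and_run_instance entered excutils.save_and_reraise_exception() to clean up, and the cleanup itself (abort_instance_claim, then instance.save() over RPC) hit InstanceNotFound because the instance row was already gone; oslo.messaging then re-raises that on the compute side as the _Remote variant seen above. A short sketch of the oslo.utils context-manager behavior under an assumed failing cleanup (stand-in exceptions, oslo.utils required):

    from oslo_utils import excutils

    def build_instance(cleanup):
        try:
            raise RuntimeError("spawn failed")   # stands in for ImageNotAuthorized
        except RuntimeError:
            # Saves the current exception and re-raises it if the block
            # exits cleanly; an exception inside the block takes over.
            with excutils.save_and_reraise_exception():
                cleanup()                        # stands in for the claim abort

    def failing_cleanup():
        raise LookupError("instance row already deleted")  # stands in for InstanceNotFound

    try:
        build_instance(failing_cleanup)
    except LookupError as exc:
        # The cleanup's exception is the one that propagates; oslo.utils logs
        # the saved original as dropped, which is why InstanceNotFound sits at
        # the bottom of the chained traceback above.
        print("surfaced:", exc)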
[ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Traceback (most recent call last): [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return getattr(target, method)(*args, **kwargs) [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.734325] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return fn(self, *args, **kwargs) [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] old_ref, inst_ref = db.instance_update_and_get_original( [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return f(*args, **kwargs) [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] with excutils.save_and_reraise_exception() as ectxt: [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self.force_reraise() [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.735431] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise self.value [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return f(*args, 
**kwargs) [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return f(context, *args, **kwargs) [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise exception.InstanceNotFound(instance_id=uuid) [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.737330] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] nova.exception.InstanceNotFound: Instance 7476fb96-5247-472c-ab92-ef7e5916cb00 could not be found. [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] During handling of the above exception, another exception occurred: [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Traceback (most recent call last): [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] ret = obj(*args, **kwargs) [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] exception_handler_v20(status_code, error_body) [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise client_exc(message=error_message, [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1031.737910] 
env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Neutron server returns request_ids: ['req-c577b807-d83f-40ee-8266-43700f02671a'] [ 1031.737910] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] During handling of the above exception, another exception occurred: [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Traceback (most recent call last): [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self._deallocate_network(context, instance, requested_networks) [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self.network_api.deallocate_for_instance( [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/network/neutron.py", line 1798, in deallocate_for_instance [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] data = neutron.list_ports(**search_opts) [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] ret = obj(*args, **kwargs) [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1031.738457] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self.list('ports', self.ports_path, retrieve_all, [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] ret = obj(*args, **kwargs) [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] for r in self._pagination(collection, path, **params): [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] res = self.get(path, params=params) [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] ret = obj(*args, **kwargs) [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self.retry_request("GET", action, body=body, [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] ret = obj(*args, **kwargs) [ 1031.738961] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1031.739475] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] return self.do_request(method, action, body=body, [ 1031.739475] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.739475] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] ret = obj(*args, **kwargs) [ 1031.739475] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1031.739475] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] self._handle_fault_response(status_code, replybody, resp) [ 1031.739475] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1031.739475] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] raise exception.Unauthorized() [ 1031.739475] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] nova.exception.Unauthorized: Not authorized. [ 1031.739475] env[60400]: ERROR nova.compute.manager [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] [ 1031.757555] env[60400]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "7476fb96-5247-472c-ab92-ef7e5916cb00" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 315.254s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1031.771606] env[60400]: DEBUG nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Starting instance... 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1031.809057] env[60400]: DEBUG neutronclient.v2_0.client [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60400) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1031.810591] env[60400]: ERROR nova.compute.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Traceback (most recent call last): [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] result = getattr(controller, method)(*args, **kwargs) [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self._get(image_id) [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1031.810591] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] resp, body = self.http_client.get(url, headers=header) [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self.request(url, 'GET', **kwargs) [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self._handle_response(resp) [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in 
_handle_response [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise exc.from_response(resp, resp.content) [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] During handling of the above exception, another exception occurred: [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Traceback (most recent call last): [ 1031.810907] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self.driver.spawn(context, instance, image_meta, [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self._fetch_image_if_missing(context, vi) [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] image_fetch(context, vi, tmp_image_ds_loc) [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] images.fetch_image( [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] metadata = IMAGE_API.get(context, image_ref) [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1031.811216] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return session.show(context, image_id, [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1031.811498] 
env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] _reraise_translated_image_exception(image_id) [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise new_exc.with_traceback(exc_trace) [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] result = getattr(controller, method)(*args, **kwargs) [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self._get(image_id) [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1031.811498] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] resp, body = self.http_client.get(url, headers=header) [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self.request(url, 'GET', **kwargs) [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self._handle_response(resp) [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise exc.from_response(resp, resp.content) [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] During handling of the above exception, another exception occurred: [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Traceback (most recent call last): [ 1031.811789] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self._build_and_run_instance(context, instance, image, [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] with excutils.save_and_reraise_exception(): [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self.force_reraise() [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise self.value [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] with self.rt.instance_claim(context, instance, node, allocs, [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self.abort() [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1031.812098] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self.tracker.abort_instance_claim(self.context, self.instance, [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return f(*args, **kwargs) [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self._unset_instance_host_and_node(instance) [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 
95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] instance.save() [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] updates, result = self.indirection_api.object_action( [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return cctxt.call(context, 'object_action', objinst=objinst, [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1031.812411] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] result = self.transport._send( [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self._driver.send(target, ctxt, message, [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise result [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] nova.exception_Remote.InstanceNotFound_Remote: Instance 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3 could not be found. 
[ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Traceback (most recent call last): [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return getattr(target, method)(*args, **kwargs) [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.812704] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return fn(self, *args, **kwargs) [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] old_ref, inst_ref = db.instance_update_and_get_original( [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return f(*args, **kwargs) [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] with excutils.save_and_reraise_exception() as ectxt: [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self.force_reraise() [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813015] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise self.value [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return f(*args, 
**kwargs) [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return f(context, *args, **kwargs) [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise exception.InstanceNotFound(instance_id=uuid) [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813354] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] nova.exception.InstanceNotFound: Instance 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3 could not be found. [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] During handling of the above exception, another exception occurred: [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Traceback (most recent call last): [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] ret = obj(*args, **kwargs) [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] exception_handler_v20(status_code, error_body) [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise client_exc(message=error_message, [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1031.813697] 
env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Neutron server returns request_ids: ['req-040ff457-8738-4732-8f60-39c2ce80a445'] [ 1031.813697] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] During handling of the above exception, another exception occurred: [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Traceback (most recent call last): [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self._deallocate_network(context, instance, requested_networks) [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self.network_api.deallocate_for_instance( [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/network/neutron.py", line 1798, in deallocate_for_instance [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] data = neutron.list_ports(**search_opts) [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] ret = obj(*args, **kwargs) [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1031.814022] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self.list('ports', self.ports_path, retrieve_all, [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] ret = obj(*args, **kwargs) [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] for r in self._pagination(collection, path, **params): [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] res = self.get(path, params=params) [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] ret = obj(*args, **kwargs) [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self.retry_request("GET", action, body=body, [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] ret = obj(*args, **kwargs) [ 1031.814304] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1031.814912] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] return self.do_request(method, action, body=body, [ 1031.814912] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.814912] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] ret = obj(*args, **kwargs) [ 1031.814912] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1031.814912] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] self._handle_fault_response(status_code, replybody, resp) [ 1031.814912] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1031.814912] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] raise exception.Unauthorized() [ 1031.814912] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] nova.exception.Unauthorized: Not authorized. 
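Both deallocate failures in this section end identically: the token behind the request has expired, python-neutronclient converts the 401 body into neutronclient.common.exceptions.Unauthorized, and the wrapper frames at nova/network/neutron.py lines 196 and 204 re-raise it as nova.exception.Unauthorized. A self-contained sketch of that decorator pattern, with stand-in exception classes in place of the real ones:

    import functools

    class NeutronUnauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    class NovaUnauthorized(Exception):
        """Stand-in for nova.exception.Unauthorized."""

    def translate_client_exceptions(func):
        # Each proxied neutronclient call runs through a wrapper like the
        # repeated "neutron.py, line 196, in wrapper" frames above, so a 401
        # surfaces to the compute manager as Nova's own Unauthorized.
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except NeutronUnauthorized:
                raise NovaUnauthorized("Not authorized.")
        return wrapper

    @translate_client_exceptions
    def list_ports(**search_opts):
        raise NeutronUnauthorized(
            "401: The request you have made requires authentication.")

Calling list_ports(device_id="95f71b47-73c8-4a82-b806-f6f2ed9cdbb3") raises the stand-in NovaUnauthorized, the same shape as the deallocate_for_instance failure above.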
[ 1031.814912] env[60400]: ERROR nova.compute.manager [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] [ 1031.842791] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "95f71b47-73c8-4a82-b806-f6f2ed9cdbb3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 316.703s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1031.843701] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1031.843918] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1031.846059] env[60400]: INFO nova.compute.claims [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1031.859266] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1031.859266] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1031.859266] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1031.859266] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1031.859923] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquired 
external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1031.860175] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ec99be78-c159-4b4e-b589-ce6c3604b1d6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.862716] env[60400]: DEBUG nova.compute.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1031.870953] env[60400]: DEBUG oslo_vmware.api [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Waiting for the task: (returnval){ [ 1031.870953] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52629445-ccef-4aa4-cc9d-04ef9ad8dda0" [ 1031.870953] env[60400]: _type = "Task" [ 1031.870953] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.878905] env[60400]: DEBUG oslo_vmware.api [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52629445-ccef-4aa4-cc9d-04ef9ad8dda0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1031.912837] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1031.913727] env[60400]: ERROR nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1031.913727] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Traceback (most recent call last): [ 1031.913727] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1031.913727] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1031.913727] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1031.913727] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] result = getattr(controller, method)(*args, **kwargs) [ 1031.913727] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1031.913727] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self._get(image_id) [ 1031.913727] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1031.913727] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1031.913727] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] resp, body = self.http_client.get(url, headers=header) [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self.request(url, 'GET', **kwargs) [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self._handle_response(resp) [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise exc.from_response(resp, resp.content) [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] During handling of the above exception, another exception occurred: [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1031.914020] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Traceback (most recent call last): [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] yield resources [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self.driver.spawn(context, instance, image_meta, [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self._fetch_image_if_missing(context, vi) [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] image_fetch(context, vi, tmp_image_ds_loc) [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] images.fetch_image( [ 1031.914288] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] metadata = IMAGE_API.get(context, image_ref) [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return session.show(context, image_id, [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] _reraise_translated_image_exception(image_id) [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise new_exc.with_traceback(exc_trace) [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] result = getattr(controller, method)(*args, **kwargs) [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1031.914576] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self._get(image_id) [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] resp, body = self.http_client.get(url, headers=header) [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self.request(url, 'GET', **kwargs) [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self._handle_response(resp) [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise exc.from_response(resp, resp.content) [ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1031.914855] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1031.915170] env[60400]: INFO nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Terminating instance [ 1031.916044] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1031.916258] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1031.916885] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1031.917788] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1031.919928] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-98c09bfc-7132-4c8c-8fba-e72baebf950a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.923479] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a42dd37-1f37-4f92-9d80-d8c91b3c90fe {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.934960] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1031.935212] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b7b106ea-8691-4f9e-8d8a-4e3db5b0de19 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.940078] env[60400]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1031.943339] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 
tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1031.943508] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1031.944304] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b802167a-54b3-48d9-8f82-cd82b8293520 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.951517] env[60400]: DEBUG oslo_vmware.api [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Waiting for the task: (returnval){ [ 1031.951517] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52cfd7cf-fccc-6110-95be-c597d512765c" [ 1031.951517] env[60400]: _type = "Task" [ 1031.951517] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.960218] env[60400]: DEBUG oslo_vmware.api [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52cfd7cf-fccc-6110-95be-c597d512765c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1031.977418] env[60400]: DEBUG neutronclient.v2_0.client [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60400) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1031.981184] env[60400]: ERROR nova.compute.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Traceback (most recent call last): [ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] result = getattr(controller, method)(*args, **kwargs) [ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self._get(image_id) [ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1031.981184] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] resp, body = self.http_client.get(url, headers=header) [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self.request(url, 'GET', **kwargs) [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self._handle_response(resp) [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise exc.from_response(resp, resp.content) [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] During handling of the above exception, another exception occurred: [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Traceback (most recent call last): [ 1031.981562] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self.driver.spawn(context, instance, image_meta, [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self._fetch_image_if_missing(context, vi) [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] image_fetch(context, vi, tmp_image_ds_loc) [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] images.fetch_image( [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] metadata = IMAGE_API.get(context, image_ref) [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1031.981832] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return session.show(context, image_id, [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] _reraise_translated_image_exception(image_id) [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise new_exc.with_traceback(exc_trace) [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: 
c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] result = getattr(controller, method)(*args, **kwargs) [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self._get(image_id) [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1031.983399] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] resp, body = self.http_client.get(url, headers=header) [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self.request(url, 'GET', **kwargs) [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self._handle_response(resp) [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise exc.from_response(resp, resp.content) [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] During handling of the above exception, another exception occurred: [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Traceback (most recent call last): [ 1031.983834] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self._build_and_run_instance(context, instance, image, [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] with excutils.save_and_reraise_exception(): [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self.force_reraise() [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise self.value [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] with self.rt.instance_claim(context, instance, node, allocs, [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self.abort() [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1031.984254] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self.tracker.abort_instance_claim(self.context, self.instance, [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return f(*args, **kwargs) [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self._unset_instance_host_and_node(instance) [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: 
c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] instance.save() [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] updates, result = self.indirection_api.object_action( [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return cctxt.call(context, 'object_action', objinst=objinst, [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1031.984700] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] result = self.transport._send( [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self._driver.send(target, ctxt, message, [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise result [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] nova.exception_Remote.InstanceNotFound_Remote: Instance c5b391a9-7969-4119-9bc6-b0e1fe7a9713 could not be found. 
[ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Traceback (most recent call last): [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return getattr(target, method)(*args, **kwargs) [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.985385] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return fn(self, *args, **kwargs) [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] old_ref, inst_ref = db.instance_update_and_get_original( [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return f(*args, **kwargs) [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] with excutils.save_and_reraise_exception() as ectxt: [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self.force_reraise() [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.985879] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise self.value [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return f(*args, 
**kwargs) [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return f(context, *args, **kwargs) [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise exception.InstanceNotFound(instance_id=uuid) [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.986389] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] nova.exception.InstanceNotFound: Instance c5b391a9-7969-4119-9bc6-b0e1fe7a9713 could not be found. [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] During handling of the above exception, another exception occurred: [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Traceback (most recent call last): [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] ret = obj(*args, **kwargs) [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] exception_handler_v20(status_code, error_body) [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise client_exc(message=error_message, [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1031.987230] 
env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Neutron server returns request_ids: ['req-9cde4db3-e508-4e8d-b89a-e7391af37a64'] [ 1031.987230] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] During handling of the above exception, another exception occurred: [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Traceback (most recent call last): [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self._deallocate_network(context, instance, requested_networks) [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self.network_api.deallocate_for_instance( [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/network/neutron.py", line 1798, in deallocate_for_instance [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] data = neutron.list_ports(**search_opts) [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] ret = obj(*args, **kwargs) [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1031.987911] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self.list('ports', self.ports_path, retrieve_all, [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] ret = obj(*args, **kwargs) [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] for r in self._pagination(collection, path, **params): [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] res = self.get(path, params=params) [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] ret = obj(*args, **kwargs) [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self.retry_request("GET", action, body=body, [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] ret = obj(*args, **kwargs) [ 1031.988528] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1031.989904] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] return self.do_request(method, action, body=body, [ 1031.989904] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1031.989904] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] ret = obj(*args, **kwargs) [ 1031.989904] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1031.989904] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] self._handle_fault_response(status_code, replybody, resp) [ 1031.989904] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1031.989904] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] raise exception.Unauthorized() [ 1031.989904] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] nova.exception.Unauthorized: Not authorized. 
[ 1031.989904] env[60400]: ERROR nova.compute.manager [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] [ 1031.998601] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1031.998806] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1031.998981] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Deleting the datastore file [datastore1] 837197c0-9ff8-45a2-8bf0-730158a43a17 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1031.999238] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5b5f40bf-9dc4-40e0-87a5-5a736a3284d1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.006981] env[60400]: DEBUG oslo_vmware.api [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Waiting for the task: (returnval){ [ 1032.006981] env[60400]: value = "task-449869" [ 1032.006981] env[60400]: _type = "Task" [ 1032.006981] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1032.011196] env[60400]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "c5b391a9-7969-4119-9bc6-b0e1fe7a9713" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 317.005s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1032.016700] env[60400]: DEBUG oslo_vmware.api [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Task: {'id': task-449869, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1032.025431] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Starting instance... 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1032.055260] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ee20cb3-ec16-4ca4-ba39-25e5b5831402 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.066794] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-749b9806-e71b-49fb-9e40-63d182aee507 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.097329] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1032.098080] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f3efef5-99e0-4230-9fc7-327b7bd173cc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.105652] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-326cb4c8-c233-4a05-a485-14538005b454 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.119761] env[60400]: DEBUG nova.compute.provider_tree [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1032.129006] env[60400]: DEBUG nova.scheduler.client.report [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1032.143527] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1032.144184] env[60400]: DEBUG nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Start building networks asynchronously for instance. 
{{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1032.147755] env[60400]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.208s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1032.149061] env[60400]: INFO nova.compute.claims [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1032.154854] env[60400]: DEBUG oslo_vmware.api [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Task: {'id': task-449867, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07174} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1032.155075] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1032.155248] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1032.155409] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1032.155565] env[60400]: INFO nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1032.157611] env[60400]: DEBUG nova.compute.claims [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1032.157769] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1032.185273] env[60400]: DEBUG nova.compute.utils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1032.186864] env[60400]: DEBUG nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Not allocating networking since 'none' was specified. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 1032.193362] env[60400]: DEBUG nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1032.264078] env[60400]: DEBUG nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Start spawning the instance on the hypervisor. 
{{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1032.287226] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 1032.287226] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 1032.287226] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 1032.287463] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 1032.287463] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 1032.287463] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 1032.287463] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 1032.287463] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 1032.287625] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 
tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 1032.287625] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 1032.287625] env[60400]: DEBUG nova.virt.hardware [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 1032.289839] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff8dfc95-cc6c-4f3e-b252-ab09d2a75027 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.299637] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f71594d2-7959-46f5-be7d-57a70afbb949 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.315435] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Instance VIF info [] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1032.321032] env[60400]: DEBUG oslo.service.loopingcall [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1032.322256] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1032.322632] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf99b959-d2f8-44bb-9dc9-4c1af808b5f5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.324994] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-619c047f-6802-4658-b2d2-a13d8b690acb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.342410] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17d60d54-401d-4e34-a4f7-1aa2797f802e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.345475] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1032.345475] env[60400]: value = "task-449870" [ 1032.345475] env[60400]: _type = "Task" [ 1032.345475] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1032.378989] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e350c64f-3758-4faa-b8df-714edbc7396a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.384705] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449870, 'name': CreateVM_Task} progress is 15%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1032.392812] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36608fd7-319c-47a7-90a9-b5f6de56a746 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.397108] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1032.397385] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1032.397613] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1032.407900] env[60400]: DEBUG nova.compute.provider_tree [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1032.417308] env[60400]: DEBUG nova.scheduler.client.report [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1032.431220] env[60400]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1032.431575] env[60400]: DEBUG nova.compute.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1032.434180] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.337s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1032.435576] env[60400]: INFO nova.compute.claims [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1032.461847] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1032.462158] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Creating directory with path [datastore1] vmware_temp/c8a6849e-df04-4dcd-8408-11737ef8e7c6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1032.462359] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dc8c5a14-1360-4a57-83b1-b813f19698ed {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.469330] env[60400]: DEBUG nova.compute.utils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1032.470624] env[60400]: DEBUG nova.compute.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Allocating IP information in the background. 
{{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1032.470813] env[60400]: DEBUG nova.network.neutron [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1032.474206] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Created directory with path [datastore1] vmware_temp/c8a6849e-df04-4dcd-8408-11737ef8e7c6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1032.474388] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Fetch image to [datastore1] vmware_temp/c8a6849e-df04-4dcd-8408-11737ef8e7c6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1032.474550] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/c8a6849e-df04-4dcd-8408-11737ef8e7c6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1032.475577] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddda433e-0019-43ed-a8a2-41abac881a8b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.482103] env[60400]: DEBUG nova.compute.manager [req-d990310f-d334-4b0b-9579-341b0cd2a094 req-5b221815-e45d-42c4-bb90-5038a9d23458 service nova] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Received event network-changed-64fa08d6-5cd6-4437-b6ca-08257e3f0696 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1032.482281] env[60400]: DEBUG nova.compute.manager [req-d990310f-d334-4b0b-9579-341b0cd2a094 req-5b221815-e45d-42c4-bb90-5038a9d23458 service nova] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Refreshing instance network info cache due to event network-changed-64fa08d6-5cd6-4437-b6ca-08257e3f0696. 
{{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1032.482479] env[60400]: DEBUG oslo_concurrency.lockutils [req-d990310f-d334-4b0b-9579-341b0cd2a094 req-5b221815-e45d-42c4-bb90-5038a9d23458 service nova] Acquiring lock "refresh_cache-b5ad6145-8bf0-4aed-951b-eb11dd87ed7d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1032.482612] env[60400]: DEBUG oslo_concurrency.lockutils [req-d990310f-d334-4b0b-9579-341b0cd2a094 req-5b221815-e45d-42c4-bb90-5038a9d23458 service nova] Acquired lock "refresh_cache-b5ad6145-8bf0-4aed-951b-eb11dd87ed7d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1032.482761] env[60400]: DEBUG nova.network.neutron [req-d990310f-d334-4b0b-9579-341b0cd2a094 req-5b221815-e45d-42c4-bb90-5038a9d23458 service nova] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Refreshing network info cache for port 64fa08d6-5cd6-4437-b6ca-08257e3f0696 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 1032.485846] env[60400]: DEBUG nova.compute.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1032.500051] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-487d7239-7ca1-4bba-b088-08c5914163a0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.520712] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b122b1a7-ae0d-428d-839e-61239f978bd8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.535556] env[60400]: DEBUG oslo_vmware.api [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Task: {'id': task-449869, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072876} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1032.562315] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1032.562588] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1032.562798] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1032.562976] env[60400]: INFO nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Took 0.65 seconds to destroy the instance on the hypervisor. [ 1032.569863] env[60400]: DEBUG nova.compute.claims [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1032.570055] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1032.571483] env[60400]: DEBUG nova.policy [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '349d330e9c374dbdab47582c51ca9168', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a1fb7769ccc2463094e0dd138a59226e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 1032.573904] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aec78014-475b-42c5-8403-16accafc4e87 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.577568] env[60400]: INFO nova.virt.block_device [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Booting with volume 12c96912-3d03-4b7d-9d94-8ff71c6cc5d0 at /dev/sda [ 1032.585718] env[60400]: DEBUG oslo_vmware.service [-] 
Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2b1817e9-57c5-4bf8-9f8b-8756848a4a9e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.608417] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1032.639903] env[60400]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c50990cb-f244-480e-b10e-f3a0677ec3d5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.653746] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fa726cf-cc07-4e6f-ab4e-8c02103b5a24 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.664863] env[60400]: DEBUG oslo_vmware.rw_handles [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c8a6849e-df04-4dcd-8408-11737ef8e7c6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1032.723900] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eca4754b-d3db-425c-a4b1-8cb72d4a6d4c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.737794] env[60400]: DEBUG oslo_vmware.rw_handles [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1032.738122] env[60400]: DEBUG oslo_vmware.rw_handles [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c8a6849e-df04-4dcd-8408-11737ef8e7c6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1032.738531] env[60400]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-95536e36-0c4d-44d1-a193-de0fdb2cd113 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.743626] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d14b2fa-b3fe-4978-acf1-a4c89f9da16a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.752084] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b8cb7d5-4c36-46e9-9662-045023ed4f03 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.790326] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2accba4-f256-4e7d-b0d3-d9afde9d7c90 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.798669] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9e37179-0e45-4a95-bb47-2b9ce0bb535a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.809265] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eca1e163-ddd7-49f5-8775-1e27b69687ec {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.819974] env[60400]: DEBUG nova.compute.provider_tree [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1032.829324] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b660982a-a054-47c1-be5d-d979b3c4bc4a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.831341] env[60400]: DEBUG nova.scheduler.client.report [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1032.847525] env[60400]: DEBUG nova.virt.block_device [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Updating existing volume attachment record: 716255e7-fa9a-439e-85b7-828ce9acaee6 {{(pid=60400) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1032.852783] env[60400]: DEBUG oslo_concurrency.lockutils [None 
req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.416s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1032.860772] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.703s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1032.872560] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449870, 'name': CreateVM_Task, 'duration_secs': 0.261054} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1032.872774] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1032.873223] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1032.873550] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1032.873677] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1032.874503] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8dcbc1e4-b05d-4a0e-86e5-228347fa5a0a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.877360] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "799b5497-c8d2-4088-85d0-3d83952b5b72" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1032.877360] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "799b5497-c8d2-4088-85d0-3d83952b5b72" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: waited 0.000s {{(pid=60400) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1032.881735] env[60400]: DEBUG oslo_vmware.api [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for the task: (returnval){ [ 1032.881735] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5280379e-bc60-8ed8-7ce1-abd7401e1da8" [ 1032.881735] env[60400]: _type = "Task" [ 1032.881735] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1032.885981] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "799b5497-c8d2-4088-85d0-3d83952b5b72" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: held 0.009s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1032.886506] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1032.894387] env[60400]: DEBUG oslo_vmware.api [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5280379e-bc60-8ed8-7ce1-abd7401e1da8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1032.915019] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.053s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1032.915019] env[60400]: DEBUG nova.compute.utils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Instance 35630c7b-fdf4-4d6d-8e5a-0045f1387f93 could not be found. {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1032.915639] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.346s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1032.918789] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Instance disappeared during build. 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1032.918946] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1032.919133] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1032.919275] env[60400]: DEBUG nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1032.919421] env[60400]: DEBUG nova.network.neutron [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1032.944134] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1032.944873] env[60400]: DEBUG nova.compute.utils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Instance 837197c0-9ff8-45a2-8bf0-730158a43a17 could not be found. {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1032.946847] env[60400]: DEBUG nova.compute.utils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1032.948280] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Instance disappeared during build. 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1032.948485] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1032.948656] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1032.948813] env[60400]: DEBUG nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1032.948960] env[60400]: DEBUG nova.network.neutron [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1032.951329] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1032.951487] env[60400]: DEBUG nova.network.neutron [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1032.956966] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1032.959942] env[60400]: DEBUG nova.network.neutron [req-d990310f-d334-4b0b-9579-341b0cd2a094 req-5b221815-e45d-42c4-bb90-5038a9d23458 service nova] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Updated VIF entry in instance network info cache for port 64fa08d6-5cd6-4437-b6ca-08257e3f0696. 
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 1032.960140] env[60400]: DEBUG nova.network.neutron [req-d990310f-d334-4b0b-9579-341b0cd2a094 req-5b221815-e45d-42c4-bb90-5038a9d23458 service nova] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Updating instance_info_cache with network_info: [{"id": "64fa08d6-5cd6-4437-b6ca-08257e3f0696", "address": "fa:16:3e:ee:c5:89", "network": {"id": "a080c599-4112-4b95-a2aa-a105bcae80e4", "bridge": "br-int", "label": "tempest-ServersTestJSON-1899793423-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6c15bcc07e0a4e4fa73b77d300814d00", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a874c214-8cdf-4a41-a718-84262b2a28d8", "external-id": "cl2-zone-726", "segmentation_id": 726, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64fa08d6-5c", "ovs_interfaceid": "64fa08d6-5cd6-4437-b6ca-08257e3f0696", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1032.971326] env[60400]: DEBUG oslo_concurrency.lockutils [req-d990310f-d334-4b0b-9579-341b0cd2a094 req-5b221815-e45d-42c4-bb90-5038a9d23458 service nova] Releasing lock "refresh_cache-b5ad6145-8bf0-4aed-951b-eb11dd87ed7d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1033.029212] env[60400]: DEBUG neutronclient.v2_0.client [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60400) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1033.031114] env[60400]: ERROR nova.compute.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1033.031114] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Traceback (most recent call last): [ 1033.031114] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1033.031114] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1033.031114] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1033.031114] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] result = getattr(controller, method)(*args, **kwargs) [ 1033.031114] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1033.031114] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self._get(image_id) [ 1033.031114] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1033.031114] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1033.031114] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] resp, body = self.http_client.get(url, headers=header) [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self.request(url, 'GET', **kwargs) [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self._handle_response(resp) [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise exc.from_response(resp, resp.content) [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] During handling of the above exception, another exception occurred: [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.031408] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Traceback (most recent call last): [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self.driver.spawn(context, instance, image_meta, [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self._fetch_image_if_missing(context, vi) [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] image_fetch(context, vi, tmp_image_ds_loc) [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] images.fetch_image( [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] metadata = IMAGE_API.get(context, image_ref) [ 1033.031702] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return session.show(context, image_id, [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] _reraise_translated_image_exception(image_id) [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise new_exc.with_traceback(exc_trace) [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 
35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] result = getattr(controller, method)(*args, **kwargs) [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self._get(image_id) [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1033.032147] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] resp, body = self.http_client.get(url, headers=header) [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self.request(url, 'GET', **kwargs) [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self._handle_response(resp) [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise exc.from_response(resp, resp.content) [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] During handling of the above exception, another exception occurred: [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.032456] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Traceback (most recent call last): [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self._build_and_run_instance(context, instance, image, [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] with excutils.save_and_reraise_exception(): [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self.force_reraise() [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise self.value [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] with self.rt.instance_claim(context, instance, node, allocs, [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self.abort() [ 1033.032763] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self.tracker.abort_instance_claim(self.context, self.instance, [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return f(*args, **kwargs) [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self._unset_instance_host_and_node(instance) [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 
35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] instance.save() [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] updates, result = self.indirection_api.object_action( [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return cctxt.call(context, 'object_action', objinst=objinst, [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1033.033094] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] result = self.transport._send( [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self._driver.send(target, ctxt, message, [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise result [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] nova.exception_Remote.InstanceNotFound_Remote: Instance 35630c7b-fdf4-4d6d-8e5a-0045f1387f93 could not be found. 
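The middle of this chain is the standard oslo cleanup pattern: _build_and_run_instance preserves the build failure with excutils.save_and_reraise_exception(), but the claim abort that runs while the exception is saved (abort_instance_claim -> instance.save() over conductor RPC) fails on its own because the instance row has already been deleted, so the chain ends in InstanceNotFound instead of the original image error; the indented traceback reproduced next is the conductor-side one. A small self-contained example of the basic save_and_reraise_exception behavior, assuming oslo.utils is installed:

    from oslo_utils import excutils

    def cleanup():
        print("cleanup ran")  # executes before the original error resurfaces

    try:
        try:
            raise RuntimeError("build failed")  # the original error
        except RuntimeError:
            # On leaving the block, __exit__ calls force_reraise(), i.e. the
            # "raise self.value" frames visible in the traceback above.
            with excutils.save_and_reraise_exception():
                cleanup()
    except RuntimeError as exc:
        print("re-raised:", exc)  # -> re-raised: build failed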
[ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Traceback (most recent call last): [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return getattr(target, method)(*args, **kwargs) [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.033397] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return fn(self, *args, **kwargs) [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] old_ref, inst_ref = db.instance_update_and_get_original( [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return f(*args, **kwargs) [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] with excutils.save_and_reraise_exception() as ectxt: [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self.force_reraise() [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.033695] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise self.value [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return f(*args, 
**kwargs) [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return f(context, *args, **kwargs) [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise exception.InstanceNotFound(instance_id=uuid) [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.034067] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] nova.exception.InstanceNotFound: Instance 35630c7b-fdf4-4d6d-8e5a-0045f1387f93 could not be found. [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] During handling of the above exception, another exception occurred: [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Traceback (most recent call last): [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] ret = obj(*args, **kwargs) [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] exception_handler_v20(status_code, error_body) [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise client_exc(message=error_message, [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1033.034451] 
env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Neutron server returns request_ids: ['req-1f1b5dc1-1290-4bf3-b850-56f44607776a'] [ 1033.034451] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] During handling of the above exception, another exception occurred: [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Traceback (most recent call last): [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self._deallocate_network(context, instance, requested_networks) [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self.network_api.deallocate_for_instance( [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/network/neutron.py", line 1798, in deallocate_for_instance [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] data = neutron.list_ports(**search_opts) [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] ret = obj(*args, **kwargs) [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1033.034780] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self.list('ports', self.ports_path, retrieve_all, [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] ret = obj(*args, **kwargs) [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] for r in self._pagination(collection, path, **params): [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] res = self.get(path, params=params) [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] ret = obj(*args, **kwargs) [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self.retry_request("GET", action, body=body, [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] ret = obj(*args, **kwargs) [ 1033.035658] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1033.036034] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] return self.do_request(method, action, body=body, [ 1033.036034] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1033.036034] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] ret = obj(*args, **kwargs) [ 1033.036034] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1033.036034] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] self._handle_fault_response(status_code, replybody, resp) [ 1033.036034] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1033.036034] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] raise exception.Unauthorized() [ 1033.036034] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] nova.exception.Unauthorized: Not authorized. [ 1033.036034] env[60400]: ERROR nova.compute.manager [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] [ 1033.038497] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Start spawning the instance on the hypervisor. 
{{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1033.061217] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "35630c7b-fdf4-4d6d-8e5a-0045f1387f93" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 314.075s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1033.095840] env[60400]: DEBUG nova.policy [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fe442deed75545e8a0c44706c74a99ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d40db2e2f5c492f92f6943a058f1412', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}} [ 1033.192955] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=<?>,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T04:32:18Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 1033.194034] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 1033.194034] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 1033.194255] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 1033.198016] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints
/opt/stack/nova/nova/virt/hardware.py:387}} [ 1033.198016] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 1033.198016] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 1033.198016] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 1033.198016] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 1033.198209] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 1033.198209] env[60400]: DEBUG nova.virt.hardware [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 1033.198209] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70c3f7ec-b819-4a6f-9ef3-2a4625409888 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.201889] env[60400]: DEBUG neutronclient.v2_0.client [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60400) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1033.204054] env[60400]: ERROR nova.compute.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
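Interleaved with these failures, the nova.virt.hardware DEBUG records above show _get_desirable_cpu_topologies resolving the m1.nano flavor: neither flavor nor image sets topology constraints (limits 0:0:0), the per-dimension maximums default to 65536, and for a single vCPU the only topology that uses every CPU is sockets=1, cores=1, threads=1. A toy reproduction of that enumeration, deliberately simplified from Nova's actual implementation (the traceback that follows is the detail for the "Failed to deallocate networks" error just logged):

    from collections import namedtuple

    VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

    def possible_topologies(vcpus, maximum):
        # Enumerate sockets*cores*threads combinations that use exactly
        # `vcpus` CPUs and respect the per-dimension maximums (65536 when
        # flavor and image set no limits, as in the records above).
        for s in range(1, min(vcpus, maximum.sockets) + 1):
            for c in range(1, min(vcpus // s, maximum.cores) + 1):
                for t in range(1, min(vcpus // (s * c), maximum.threads) + 1):
                    if s * c * t == vcpus:
                        yield VirtCPUTopology(s, c, t)

    print(list(possible_topologies(1, VirtCPUTopology(65536, 65536, 65536))))
    # -> [VirtCPUTopology(sockets=1, cores=1, threads=1)]: "Got 1 possible
    #    topologies", as logged.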
[ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Traceback (most recent call last): [ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] result = getattr(controller, method)(*args, **kwargs) [ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self._get(image_id) [ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1033.204054] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] resp, body = self.http_client.get(url, headers=header) [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self.request(url, 'GET', **kwargs) [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self._handle_response(resp) [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise exc.from_response(resp, resp.content) [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] During handling of the above exception, another exception occurred: [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Traceback (most recent call last): [ 1033.204435] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self.driver.spawn(context, instance, image_meta, [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self._fetch_image_if_missing(context, vi) [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] image_fetch(context, vi, tmp_image_ds_loc) [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] images.fetch_image( [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] metadata = IMAGE_API.get(context, image_ref) [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1033.204758] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return session.show(context, image_id, [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] _reraise_translated_image_exception(image_id) [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise new_exc.with_traceback(exc_trace) [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 
837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] result = getattr(controller, method)(*args, **kwargs) [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self._get(image_id) [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1033.205107] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] resp, body = self.http_client.get(url, headers=header) [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self.request(url, 'GET', **kwargs) [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self._handle_response(resp) [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise exc.from_response(resp, resp.content) [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9. 
[ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] During handling of the above exception, another exception occurred: [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Traceback (most recent call last): [ 1033.205437] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self._build_and_run_instance(context, instance, image, [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] with excutils.save_and_reraise_exception(): [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self.force_reraise() [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise self.value [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] with self.rt.instance_claim(context, instance, node, allocs, [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self.abort() [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1033.205737] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self.tracker.abort_instance_claim(self.context, self.instance, [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return f(*args, **kwargs) [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self._unset_instance_host_and_node(instance) [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 
837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] instance.save() [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] updates, result = self.indirection_api.object_action( [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return cctxt.call(context, 'object_action', objinst=objinst, [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1033.206147] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] result = self.transport._send( [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self._driver.send(target, ctxt, message, [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise result [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] nova.exception_Remote.InstanceNotFound_Remote: Instance 837197c0-9ff8-45a2-8bf0-730158a43a17 could not be found. 
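The type nova.exception_Remote.InstanceNotFound_Remote is oslo.messaging's marker for a server-side failure re-raised on the RPC client: the conductor serializes the exception's class, module, message, and traceback text, and the client rebuilds a dynamic subclass whose module gains a _Remote suffix and whose message carries the server-side traceback, which is the indented Traceback block reproduced next. A simplified model of that client-side rebuild (the wire format shown is illustrative, not oslo.messaging's exact one):

    def deserialize_remote_exception(wire):
        # wire: {'class': ..., 'module': ..., 'message': ..., 'tb': [...]}
        base = type(wire['class'], (Exception,), {})
        remote = type(wire['class'] + '_Remote', (base,),
                      {'__module__': wire['module'] + '_Remote'})
        # Embed the conductor-side traceback text so it shows up in the
        # compute log, as it does above.
        return remote(wire['message'] + '\n' + '\n'.join(wire['tb']))

    exc = deserialize_remote_exception({
        'class': 'InstanceNotFound',
        'module': 'nova.exception',
        'message': 'Instance 837197c0-9ff8-45a2-8bf0-730158a43a17 could not be found.',
        'tb': ['Traceback (most recent call last):'],
    })
    print(type(exc).__module__ + '.' + type(exc).__name__)
    # -> nova.exception_Remote.InstanceNotFound_Remote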
[ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Traceback (most recent call last): [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return getattr(target, method)(*args, **kwargs) [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.206431] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return fn(self, *args, **kwargs) [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] old_ref, inst_ref = db.instance_update_and_get_original( [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return f(*args, **kwargs) [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] with excutils.save_and_reraise_exception() as ectxt: [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self.force_reraise() [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.206780] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise self.value [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return f(*args, 
**kwargs) [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return f(context, *args, **kwargs) [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise exception.InstanceNotFound(instance_id=uuid) [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.207171] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] nova.exception.InstanceNotFound: Instance 837197c0-9ff8-45a2-8bf0-730158a43a17 could not be found. [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] During handling of the above exception, another exception occurred: [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Traceback (most recent call last): [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] ret = obj(*args, **kwargs) [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] exception_handler_v20(status_code, error_body) [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise client_exc(message=error_message, [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1033.207549] 
env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Neutron server returns request_ids: ['req-e7cb33aa-73f3-4acc-b095-57dd57ac4c6e'] [ 1033.207549] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] During handling of the above exception, another exception occurred: [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Traceback (most recent call last): [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self._deallocate_network(context, instance, requested_networks) [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self.network_api.deallocate_for_instance( [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/network/neutron.py", line 1798, in deallocate_for_instance [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] data = neutron.list_ports(**search_opts) [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] ret = obj(*args, **kwargs) [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1033.207914] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self.list('ports', self.ports_path, retrieve_all, [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] ret = obj(*args, **kwargs) [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] for r in self._pagination(collection, path, **params): [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] res = self.get(path, params=params) [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] ret = obj(*args, **kwargs) [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self.retry_request("GET", action, body=body, [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] ret = obj(*args, **kwargs) [ 1033.211539] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1033.211971] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] return self.do_request(method, action, body=body, [ 1033.211971] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1033.211971] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] ret = obj(*args, **kwargs) [ 1033.211971] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1033.211971] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] self._handle_fault_response(status_code, replybody, resp) [ 1033.211971] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1033.211971] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] raise exception.Unauthorized() [ 1033.211971] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] nova.exception.Unauthorized: Not authorized. [ 1033.211971] env[60400]: ERROR nova.compute.manager [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] [ 1033.212590] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a962e514-6f7a-4aae-8b23-971321689a8d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.246456] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "837197c0-9ff8-45a2-8bf0-730158a43a17" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 314.245s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1033.261056] env[60400]: DEBUG nova.compute.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Start spawning the instance on the hypervisor.
{{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1033.261605] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} [ 1033.261823] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} [ 1033.261963] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} [ 1033.262249] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} [ 1033.262431] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} [ 1033.262611] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} [ 1033.262891] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} [ 1033.263283] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} [ 1033.267022] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Got 1
possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} [ 1033.267022] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} [ 1033.267022] env[60400]: DEBUG nova.virt.hardware [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} [ 1033.267022] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5996bc28-28ad-4d07-8a68-ba3e47faa21b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.275126] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df882853-5e3e-4a81-a08f-1c4b711a9624 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.394407] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1033.395701] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1033.395701] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1033.409976] env[60400]: DEBUG nova.network.neutron [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Successfully created port: 07fc5d48-3095-4921-945f-cb713ab9fcdc {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1033.759368] env[60400]: DEBUG nova.network.neutron [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Successfully created port: 312be5f9-1d4e-4308-bc4a-71c10a1778b6 {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1034.379294] env[60400]: DEBUG nova.compute.manager [req-387fd234-79a7-409d-9e1c-a5f41bafdc79 req-68ef9015-2256-4cab-ba0c-cc2027211ff2 service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] 
Received event network-vif-plugged-07fc5d48-3095-4921-945f-cb713ab9fcdc {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1034.379294] env[60400]: DEBUG oslo_concurrency.lockutils [req-387fd234-79a7-409d-9e1c-a5f41bafdc79 req-68ef9015-2256-4cab-ba0c-cc2027211ff2 service nova] Acquiring lock "311eb356-b844-4b1b-a0f0-ed7da6bb9f1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1034.379294] env[60400]: DEBUG oslo_concurrency.lockutils [req-387fd234-79a7-409d-9e1c-a5f41bafdc79 req-68ef9015-2256-4cab-ba0c-cc2027211ff2 service nova] Lock "311eb356-b844-4b1b-a0f0-ed7da6bb9f1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1034.379294] env[60400]: DEBUG oslo_concurrency.lockutils [req-387fd234-79a7-409d-9e1c-a5f41bafdc79 req-68ef9015-2256-4cab-ba0c-cc2027211ff2 service nova] Lock "311eb356-b844-4b1b-a0f0-ed7da6bb9f1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1034.381497] env[60400]: DEBUG nova.compute.manager [req-387fd234-79a7-409d-9e1c-a5f41bafdc79 req-68ef9015-2256-4cab-ba0c-cc2027211ff2 service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] No waiting events found dispatching network-vif-plugged-07fc5d48-3095-4921-945f-cb713ab9fcdc {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1034.381497] env[60400]: WARNING nova.compute.manager [req-387fd234-79a7-409d-9e1c-a5f41bafdc79 req-68ef9015-2256-4cab-ba0c-cc2027211ff2 service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Received unexpected event network-vif-plugged-07fc5d48-3095-4921-945f-cb713ab9fcdc for instance with vm_state building and task_state spawning. 
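The hardware-topology lines at 1033.26 show Nova enumerating CPU layouts for the 1-vCPU m1.nano flavor: with no flavor or image constraints (limits and preferences all 0:0:0) the maxima default to 65536 sockets/cores/threads, and the only triple whose product equals one vCPU is 1x1x1, hence "Got 1 possible topologies". A minimal pure-Python sketch of that enumeration (illustrative only, not the actual code in nova/virt/hardware.py):

```python
# Sketch: every (sockets, cores, threads) triple within the maxima whose
# product equals the flavor's vCPU count is a possible topology.
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    found = []
    for sockets in range(1, min(max_sockets, vcpus) + 1):
        for cores in range(1, min(max_cores, vcpus) + 1):
            for threads in range(1, min(max_threads, vcpus) + 1):
                if sockets * cores * threads == vcpus:
                    found.append((sockets, cores, threads))
    return found

print(possible_topologies(1))  # -> [(1, 1, 1)], matching "Got 1 possible topologies"
```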
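The network-vif-plugged warning just above is Nova's external-event pattern in action: Neutron reports the plug before the spawning thread has registered a waiter, so popping the per-instance event under the "<uuid>-events" lock finds nothing and the event is logged as unexpected and dropped. An illustrative sketch of that registry, with invented names and one threading.Event per (instance, event) pair (not Nova's actual code):

```python
import threading
from collections import defaultdict

class InstanceEvents:
    def __init__(self):
        self._events = defaultdict(dict)  # instance_uuid -> {event_name: Event}
        self._lock = threading.Lock()     # plays the role of the "<uuid>-events" lock

    def prepare_for_event(self, instance_uuid, name):
        # The spawning thread registers interest before plugging the VIF.
        with self._lock:
            return self._events[instance_uuid].setdefault(name, threading.Event())

    def pop_instance_event(self, instance_uuid, name):
        # The external-event handler pops the waiter, if any.
        with self._lock:
            return self._events[instance_uuid].pop(name, None)

def external_instance_event(events, instance_uuid, name):
    waiter = events.pop_instance_event(instance_uuid, name)
    if waiter is None:
        # Mirrors the WARNING above: event arrived while nothing was waiting.
        print(f"WARNING: received unexpected event {name} for instance {instance_uuid}")
    else:
        waiter.set()  # wakes anyone blocked in waiter.wait()

events = InstanceEvents()
external_instance_event(events, "311eb356", "network-vif-plugged")  # -> WARNING
```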
[ 1034.502539] env[60400]: DEBUG nova.network.neutron [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Successfully updated port: 07fc5d48-3095-4921-945f-cb713ab9fcdc {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1034.518259] env[60400]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Acquiring lock "refresh_cache-311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1034.518259] env[60400]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Acquired lock "refresh_cache-311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1034.518259] env[60400]: DEBUG nova.network.neutron [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 1034.794274] env[60400]: DEBUG nova.network.neutron [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Instance cache missing network info. 
{{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 1035.145706] env[60400]: DEBUG nova.network.neutron [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Updating instance_info_cache with network_info: [{"id": "07fc5d48-3095-4921-945f-cb713ab9fcdc", "address": "fa:16:3e:95:a7:75", "network": {"id": "31b4b525-456e-41c3-bf38-7b683e7f96cb", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-85131478-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a1fb7769ccc2463094e0dd138a59226e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "274afb4c-04df-4213-8ad2-8f48a10d78a8", "external-id": "nsx-vlan-transportzone-515", "segmentation_id": 515, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07fc5d48-30", "ovs_interfaceid": "07fc5d48-3095-4921-945f-cb713ab9fcdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1035.157110] env[60400]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Releasing lock "refresh_cache-311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1035.157110] env[60400]: DEBUG nova.compute.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Instance network_info: |[{"id": "07fc5d48-3095-4921-945f-cb713ab9fcdc", "address": "fa:16:3e:95:a7:75", "network": {"id": "31b4b525-456e-41c3-bf38-7b683e7f96cb", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-85131478-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a1fb7769ccc2463094e0dd138a59226e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "274afb4c-04df-4213-8ad2-8f48a10d78a8", "external-id": "nsx-vlan-transportzone-515", "segmentation_id": 515, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07fc5d48-30", "ovs_interfaceid": "07fc5d48-3095-4921-945f-cb713ab9fcdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 
1035.157305] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:95:a7:75', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '274afb4c-04df-4213-8ad2-8f48a10d78a8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '07fc5d48-3095-4921-945f-cb713ab9fcdc', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1035.168020] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Creating folder: Project (a1fb7769ccc2463094e0dd138a59226e). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1035.168020] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b77d4ed2-c376-43f0-9c1e-952ebf284b2c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.184019] env[60400]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 1035.184019] env[60400]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=60400) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 1035.184019] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Folder already exists: Project (a1fb7769ccc2463094e0dd138a59226e). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1035.184019] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Creating folder: Instances. Parent ref: group-v119128. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1035.184019] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7f7cc6c2-71d4-48d1-ae80-4f33866337e8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.195061] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Created folder: Instances in parent group-v119128. [ 1035.195061] env[60400]: DEBUG oslo.service.loopingcall [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1035.195061] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1035.195061] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ef59b985-e1e1-47fd-a7c1-0c86360bc2b7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.218184] env[60400]: DEBUG nova.network.neutron [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Successfully updated port: 312be5f9-1d4e-4308-bc4a-71c10a1778b6 {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1035.226884] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1035.226884] env[60400]: value = "task-449873" [ 1035.226884] env[60400]: _type = "Task" [ 1035.226884] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1035.233031] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449873, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1035.233993] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "refresh_cache-e924a9ab-71c1-4efe-a217-b036ec785dc8" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1035.233993] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquired lock "refresh_cache-e924a9ab-71c1-4efe-a217-b036ec785dc8" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1035.234191] env[60400]: DEBUG nova.network.neutron [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 1035.300938] env[60400]: DEBUG nova.network.neutron [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Instance cache missing network info. 
{{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 1035.616022] env[60400]: DEBUG nova.network.neutron [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Updating instance_info_cache with network_info: [{"id": "312be5f9-1d4e-4308-bc4a-71c10a1778b6", "address": "fa:16:3e:b3:3c:8a", "network": {"id": "1d5275d1-9dc9-40ae-a052-b2eec7242df0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-621915561-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6d40db2e2f5c492f92f6943a058f1412", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b5215e5b-294b-4e8c-bd06-355e9955ab1d", "external-id": "nsx-vlan-transportzone-529", "segmentation_id": 529, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap312be5f9-1d", "ovs_interfaceid": "312be5f9-1d4e-4308-bc4a-71c10a1778b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1035.631024] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Releasing lock "refresh_cache-e924a9ab-71c1-4efe-a217-b036ec785dc8" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1035.631024] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Instance network_info: |[{"id": "312be5f9-1d4e-4308-bc4a-71c10a1778b6", "address": "fa:16:3e:b3:3c:8a", "network": {"id": "1d5275d1-9dc9-40ae-a052-b2eec7242df0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-621915561-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6d40db2e2f5c492f92f6943a058f1412", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b5215e5b-294b-4e8c-bd06-355e9955ab1d", "external-id": "nsx-vlan-transportzone-529", "segmentation_id": 529, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap312be5f9-1d", "ovs_interfaceid": "312be5f9-1d4e-4308-bc4a-71c10a1778b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1035.631282] env[60400]: DEBUG 
nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b3:3c:8a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b5215e5b-294b-4e8c-bd06-355e9955ab1d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '312be5f9-1d4e-4308-bc4a-71c10a1778b6', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1035.637480] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Creating folder: Project (6d40db2e2f5c492f92f6943a058f1412). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1035.638030] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6b11a6dc-e693-4780-84e6-d8efbfcabb75 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.649678] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Created folder: Project (6d40db2e2f5c492f92f6943a058f1412) in parent group-v119075. [ 1035.649887] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Creating folder: Instances. Parent ref: group-v119147. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1035.650122] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3d63e6b1-16aa-42a1-8683-047487339db5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.660083] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Created folder: Instances in parent group-v119147. [ 1035.660187] env[60400]: DEBUG oslo.service.loopingcall [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1035.660340] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1035.660737] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6ec8d53a-24e0-4629-b2f2-06595fe59181 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.683204] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1035.683204] env[60400]: value = "task-449876" [ 1035.683204] env[60400]: _type = "Task" [ 1035.683204] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1035.694970] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449876, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1035.733876] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449873, 'name': CreateVM_Task, 'duration_secs': 0.36198} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1035.734208] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1035.734885] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'disk_bus': None, 'delete_on_termination': True, 'device_type': None, 'guest_format': None, 'attachment_id': '716255e7-fa9a-439e-85b7-828ce9acaee6', 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-119131', 'volume_id': '12c96912-3d03-4b7d-9d94-8ff71c6cc5d0', 'name': 'volume-12c96912-3d03-4b7d-9d94-8ff71c6cc5d0', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '311eb356-b844-4b1b-a0f0-ed7da6bb9f1d', 'attached_at': '', 'detached_at': '', 'volume_id': '12c96912-3d03-4b7d-9d94-8ff71c6cc5d0', 'serial': '12c96912-3d03-4b7d-9d94-8ff71c6cc5d0'}, 'mount_device': '/dev/sda', 'boot_index': 0, 'volume_type': None}], 'swap': None} {{(pid=60400) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 1035.735238] env[60400]: DEBUG nova.virt.vmwareapi.volumeops [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Root volume attach. 
Driver type: vmdk {{(pid=60400) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 1035.736087] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6aafa99-8a2f-4a3d-bf77-77761a7a02d6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.744953] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eec77a2c-60dc-43f2-8b2f-4e2fafe31f65 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.751348] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4418cf3d-2a37-4073-b0e1-0b26e786734e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.757634] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-9a7f86e1-9db2-449f-a64c-1181a8d24737 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.764624] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Waiting for the task: (returnval){ [ 1035.764624] env[60400]: value = "task-449877" [ 1035.764624] env[60400]: _type = "Task" [ 1035.764624] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1035.774116] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449877, 'name': RelocateVM_Task} progress is 5%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1036.193012] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449876, 'name': CreateVM_Task, 'duration_secs': 0.305057} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1036.193274] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1036.193920] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1036.194087] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1036.194401] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1036.194640] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-53b7928c-9600-4340-936d-b8a88dde22a4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1036.199234] env[60400]: DEBUG oslo_vmware.api [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Waiting for the task: (returnval){ [ 1036.199234] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52e7fcd7-15b0-d403-7a79-774a6238d135" [ 1036.199234] env[60400]: _type = "Task" [ 1036.199234] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1036.206737] env[60400]: DEBUG oslo_vmware.api [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52e7fcd7-15b0-d403-7a79-774a6238d135, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1036.274471] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449877, 'name': RelocateVM_Task, 'duration_secs': 0.0278} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1036.274867] env[60400]: DEBUG nova.virt.vmwareapi.volumeops [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Volume attach. 
Driver type: vmdk {{(pid=60400) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 1036.275095] env[60400]: DEBUG nova.virt.vmwareapi.volumeops [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-119131', 'volume_id': '12c96912-3d03-4b7d-9d94-8ff71c6cc5d0', 'name': 'volume-12c96912-3d03-4b7d-9d94-8ff71c6cc5d0', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '311eb356-b844-4b1b-a0f0-ed7da6bb9f1d', 'attached_at': '', 'detached_at': '', 'volume_id': '12c96912-3d03-4b7d-9d94-8ff71c6cc5d0', 'serial': '12c96912-3d03-4b7d-9d94-8ff71c6cc5d0'} {{(pid=60400) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 1036.275951] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-863835c4-d9a6-40e2-81ab-7b9f08017b94 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1036.293652] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15a128ec-3796-4833-b0b5-bac9524ffb0e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1036.318631] env[60400]: DEBUG nova.virt.vmwareapi.volumeops [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Reconfiguring VM instance instance-0000001c to attach disk [datastore1] volume-12c96912-3d03-4b7d-9d94-8ff71c6cc5d0/volume-12c96912-3d03-4b7d-9d94-8ff71c6cc5d0.vmdk or device None with type thin {{(pid=60400) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 1036.318937] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-2dc7896c-3fc5-44ff-9b51-6abbd7af1efa {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1036.340220] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Waiting for the task: (returnval){ [ 1036.340220] env[60400]: value = "task-449878" [ 1036.340220] env[60400]: _type = "Task" [ 1036.340220] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1036.350244] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449878, 'name': ReconfigVM_Task} progress is 6%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1036.411251] env[60400]: DEBUG nova.compute.manager [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Received event network-changed-07fc5d48-3095-4921-945f-cb713ab9fcdc {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1036.411527] env[60400]: DEBUG nova.compute.manager [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Refreshing instance network info cache due to event network-changed-07fc5d48-3095-4921-945f-cb713ab9fcdc. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1036.411746] env[60400]: DEBUG oslo_concurrency.lockutils [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] Acquiring lock "refresh_cache-311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1036.411849] env[60400]: DEBUG oslo_concurrency.lockutils [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] Acquired lock "refresh_cache-311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1036.411979] env[60400]: DEBUG nova.network.neutron [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Refreshing network info cache for port 07fc5d48-3095-4921-945f-cb713ab9fcdc {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 1036.709907] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1036.710203] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1036.710415] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1036.718020] env[60400]: DEBUG nova.network.neutron [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Updated VIF entry in instance network info cache for port 07fc5d48-3095-4921-945f-cb713ab9fcdc. 
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 1036.718020] env[60400]: DEBUG nova.network.neutron [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Updating instance_info_cache with network_info: [{"id": "07fc5d48-3095-4921-945f-cb713ab9fcdc", "address": "fa:16:3e:95:a7:75", "network": {"id": "31b4b525-456e-41c3-bf38-7b683e7f96cb", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-85131478-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a1fb7769ccc2463094e0dd138a59226e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "274afb4c-04df-4213-8ad2-8f48a10d78a8", "external-id": "nsx-vlan-transportzone-515", "segmentation_id": 515, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07fc5d48-30", "ovs_interfaceid": "07fc5d48-3095-4921-945f-cb713ab9fcdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1036.725895] env[60400]: DEBUG oslo_concurrency.lockutils [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] Releasing lock "refresh_cache-311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1036.726132] env[60400]: DEBUG nova.compute.manager [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Received event network-vif-plugged-312be5f9-1d4e-4308-bc4a-71c10a1778b6 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1036.726308] env[60400]: DEBUG oslo_concurrency.lockutils [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] Acquiring lock "e924a9ab-71c1-4efe-a217-b036ec785dc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1036.726491] env[60400]: DEBUG oslo_concurrency.lockutils [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] Lock "e924a9ab-71c1-4efe-a217-b036ec785dc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1036.726638] env[60400]: DEBUG oslo_concurrency.lockutils [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] Lock "e924a9ab-71c1-4efe-a217-b036ec785dc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1036.726790] env[60400]: DEBUG nova.compute.manager 
[req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] No waiting events found dispatching network-vif-plugged-312be5f9-1d4e-4308-bc4a-71c10a1778b6 {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1036.726947] env[60400]: WARNING nova.compute.manager [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Received unexpected event network-vif-plugged-312be5f9-1d4e-4308-bc4a-71c10a1778b6 for instance with vm_state building and task_state spawning. [ 1036.727116] env[60400]: DEBUG nova.compute.manager [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Received event network-changed-312be5f9-1d4e-4308-bc4a-71c10a1778b6 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1036.727263] env[60400]: DEBUG nova.compute.manager [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Refreshing instance network info cache due to event network-changed-312be5f9-1d4e-4308-bc4a-71c10a1778b6. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1036.727429] env[60400]: DEBUG oslo_concurrency.lockutils [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] Acquiring lock "refresh_cache-e924a9ab-71c1-4efe-a217-b036ec785dc8" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1036.727558] env[60400]: DEBUG oslo_concurrency.lockutils [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] Acquired lock "refresh_cache-e924a9ab-71c1-4efe-a217-b036ec785dc8" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1036.727707] env[60400]: DEBUG nova.network.neutron [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Refreshing network info cache for port 312be5f9-1d4e-4308-bc4a-71c10a1778b6 {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 1036.860534] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449878, 'name': ReconfigVM_Task, 'duration_secs': 0.275117} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1036.860534] env[60400]: DEBUG nova.virt.vmwareapi.volumeops [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Reconfigured VM instance instance-0000001c to attach disk [datastore1] volume-12c96912-3d03-4b7d-9d94-8ff71c6cc5d0/volume-12c96912-3d03-4b7d-9d94-8ff71c6cc5d0.vmdk or device None with type thin {{(pid=60400) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 1036.864316] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-a8be8c8f-2ae0-4264-8bbb-8ff0ab1bcf42 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1036.886017] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Waiting for the task: (returnval){ [ 1036.886017] env[60400]: value = "task-449879" [ 1036.886017] env[60400]: _type = "Task" [ 1036.886017] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1036.892503] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449879, 'name': ReconfigVM_Task} progress is 5%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1037.303737] env[60400]: DEBUG nova.network.neutron [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Updated VIF entry in instance network info cache for port 312be5f9-1d4e-4308-bc4a-71c10a1778b6. 
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 1037.304843] env[60400]: DEBUG nova.network.neutron [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Updating instance_info_cache with network_info: [{"id": "312be5f9-1d4e-4308-bc4a-71c10a1778b6", "address": "fa:16:3e:b3:3c:8a", "network": {"id": "1d5275d1-9dc9-40ae-a052-b2eec7242df0", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-621915561-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6d40db2e2f5c492f92f6943a058f1412", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b5215e5b-294b-4e8c-bd06-355e9955ab1d", "external-id": "nsx-vlan-transportzone-529", "segmentation_id": 529, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap312be5f9-1d", "ovs_interfaceid": "312be5f9-1d4e-4308-bc4a-71c10a1778b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1037.319035] env[60400]: DEBUG oslo_concurrency.lockutils [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] Releasing lock "refresh_cache-e924a9ab-71c1-4efe-a217-b036ec785dc8" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1037.319497] env[60400]: DEBUG nova.compute.manager [req-50be686f-8f09-49f6-9a9a-75f9ba46dc15 req-63ecc84f-6b40-4386-a160-5c9e9498e5bb service nova] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Received event network-vif-deleted-21377081-ea82-47d1-a066-de059fb50c29 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1037.395625] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449879, 'name': ReconfigVM_Task, 'duration_secs': 0.120093} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1037.395949] env[60400]: DEBUG nova.virt.vmwareapi.volumeops [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-119131', 'volume_id': '12c96912-3d03-4b7d-9d94-8ff71c6cc5d0', 'name': 'volume-12c96912-3d03-4b7d-9d94-8ff71c6cc5d0', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '311eb356-b844-4b1b-a0f0-ed7da6bb9f1d', 'attached_at': '', 'detached_at': '', 'volume_id': '12c96912-3d03-4b7d-9d94-8ff71c6cc5d0', 'serial': '12c96912-3d03-4b7d-9d94-8ff71c6cc5d0'} {{(pid=60400) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 1037.396619] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-5336ec06-f737-4e7e-9e56-fd7100e7c19c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1037.403107] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Waiting for the task: (returnval){ [ 1037.403107] env[60400]: value = "task-449880" [ 1037.403107] env[60400]: _type = "Task" [ 1037.403107] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1037.412992] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449880, 'name': Rename_Task} progress is 5%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1037.913148] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449880, 'name': Rename_Task} progress is 14%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1038.413439] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449880, 'name': Rename_Task, 'duration_secs': 0.685408} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1038.413729] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Powering on the VM {{(pid=60400) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 1038.413960] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-22cdb783-7248-40a2-bfbb-1b914d7e5492 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1038.421211] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Waiting for the task: (returnval){ [ 1038.421211] env[60400]: value = "task-449881" [ 1038.421211] env[60400]: _type = "Task" [ 1038.421211] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1038.431618] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449881, 'name': PowerOnVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1038.933062] env[60400]: DEBUG oslo_vmware.api [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Task: {'id': task-449881, 'name': PowerOnVM_Task, 'duration_secs': 0.428819} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1038.933062] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Powered on the VM {{(pid=60400) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 1038.933062] env[60400]: INFO nova.compute.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Took 5.67 seconds to spawn the instance on the hypervisor. [ 1038.933062] env[60400]: DEBUG nova.compute.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Checking state {{(pid=60400) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 1038.933062] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1fec9bb-33af-4954-b168-6b9dcd316bae {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1038.982998] env[60400]: INFO nova.compute.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Took 7.06 seconds to build instance. 
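Earlier in this spawn (the Folder.CreateFolder calls around 1035.18), folder creation is idempotent: the DuplicateName SOAP fault is downgraded to "Folder already exists" and the driver carries on with the existing folder. A hedged sketch of that pattern; the exception class and create_folder callable are stand-ins, not the oslo.vmware API:

```python
class DuplicateName(Exception):
    """Stands in for the vSphere DuplicateName fault."""

def ensure_folder(create_folder, name):
    try:
        return create_folder(name)
    except DuplicateName:
        # Matches the log: treat the fault as success and reuse the folder.
        print(f"Folder already exists: {name}.")
        return None  # caller looks up the existing folder ref instead

def always_duplicate(name):
    raise DuplicateName(name)

ensure_folder(always_duplicate, "Instances")  # prints the already-exists line
```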
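The repeated "Task: {'id': task-..., ...} progress is N%" / "completed successfully ... duration_secs" lines throughout this spawn (CreateVM_Task, RelocateVM_Task, ReconfigVM_Task, Rename_Task, PowerOnVM_Task) come from a poll-until-done loop around vCenter tasks. A stand-in for that loop, assuming an invented poll() callable returning (state, progress) instead of oslo.vmware's session and task objects:

```python
import time

class TaskFailed(Exception):
    pass

def wait_for_task(poll, interval=0.5, timeout=300.0):
    start = time.monotonic()
    while True:
        state, progress = poll()
        if state == "success":
            return time.monotonic() - start     # analogous to 'duration_secs'
        if state == "error":
            raise TaskFailed(f"task failed at {progress}%")
        if time.monotonic() - start > timeout:
            raise TimeoutError("task did not complete in time")
        print(f"progress is {progress}%")       # cf. the _poll_task DEBUG lines
        time.sleep(interval)

states = iter([("running", 0), ("running", 14), ("success", 100)])
print(f"duration_secs={wait_for_task(lambda: next(states), interval=0.01):.3f}")
```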
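From here the instance is built, and the remainder of the section is the idle compute service: oslo.service fires ComputeManager periodic tasks on their configured spacing (_poll_unconfirmed_resizes, _heal_instance_info_cache, _reclaim_queued_deletes, update_available_resource, ...), producing the "Running periodic task" lines that follow. A minimal sketch of that spacing-based dispatch, with an invented decorator that is far simpler than the real oslo_service.periodic_task machinery:

```python
import time

PERIODIC_TASKS = []

def periodic_task(spacing):
    def wrap(fn):
        PERIODIC_TASKS.append((fn, spacing))
        return fn
    return wrap

class ComputeManagerSketch:
    @periodic_task(spacing=60)
    def _poll_unconfirmed_resizes(self):
        pass  # would confirm resizes older than the configured window

    @periodic_task(spacing=60)
    def _heal_instance_info_cache(self):
        pass  # would refresh one instance's network info; skips Building ones

def run_periodic_tasks(manager, last_run):
    now = time.monotonic()
    for fn, spacing in PERIODIC_TASKS:
        if now - last_run.get(fn.__name__, float("-inf")) >= spacing:
            print(f"Running periodic task {type(manager).__name__}.{fn.__name__}")
            fn(manager)
            last_run[fn.__name__] = now

run_periodic_tasks(ComputeManagerSketch(), {})  # first pass runs every task
```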
[ 1038.993871] env[60400]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 75.450s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1040.338804] env[60400]: DEBUG nova.compute.manager [req-95a312b7-aece-41fc-938d-dcc248dc495a req-d00b8085-ded3-4b43-9ba8-6c02442660b4 service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Received event network-changed-07fc5d48-3095-4921-945f-cb713ab9fcdc {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1040.339200] env[60400]: DEBUG nova.compute.manager [req-95a312b7-aece-41fc-938d-dcc248dc495a req-d00b8085-ded3-4b43-9ba8-6c02442660b4 service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Refreshing instance network info cache due to event network-changed-07fc5d48-3095-4921-945f-cb713ab9fcdc. {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1040.339240] env[60400]: DEBUG oslo_concurrency.lockutils [req-95a312b7-aece-41fc-938d-dcc248dc495a req-d00b8085-ded3-4b43-9ba8-6c02442660b4 service nova] Acquiring lock "refresh_cache-311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1040.339368] env[60400]: DEBUG oslo_concurrency.lockutils [req-95a312b7-aece-41fc-938d-dcc248dc495a req-d00b8085-ded3-4b43-9ba8-6c02442660b4 service nova] Acquired lock "refresh_cache-311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1040.339519] env[60400]: DEBUG nova.network.neutron [req-95a312b7-aece-41fc-938d-dcc248dc495a req-d00b8085-ded3-4b43-9ba8-6c02442660b4 service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Refreshing network info cache for port 07fc5d48-3095-4921-945f-cb713ab9fcdc {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 1040.702148] env[60400]: DEBUG nova.network.neutron [req-95a312b7-aece-41fc-938d-dcc248dc495a req-d00b8085-ded3-4b43-9ba8-6c02442660b4 service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Updated VIF entry in instance network info cache for port 07fc5d48-3095-4921-945f-cb713ab9fcdc. 
{{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 1040.702566] env[60400]: DEBUG nova.network.neutron [req-95a312b7-aece-41fc-938d-dcc248dc495a req-d00b8085-ded3-4b43-9ba8-6c02442660b4 service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Updating instance_info_cache with network_info: [{"id": "07fc5d48-3095-4921-945f-cb713ab9fcdc", "address": "fa:16:3e:95:a7:75", "network": {"id": "31b4b525-456e-41c3-bf38-7b683e7f96cb", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-85131478-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.136", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a1fb7769ccc2463094e0dd138a59226e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "274afb4c-04df-4213-8ad2-8f48a10d78a8", "external-id": "nsx-vlan-transportzone-515", "segmentation_id": 515, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07fc5d48-30", "ovs_interfaceid": "07fc5d48-3095-4921-945f-cb713ab9fcdc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1040.714293] env[60400]: DEBUG oslo_concurrency.lockutils [req-95a312b7-aece-41fc-938d-dcc248dc495a req-d00b8085-ded3-4b43-9ba8-6c02442660b4 service nova] Releasing lock "refresh_cache-311eb356-b844-4b1b-a0f0-ed7da6bb9f1d" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1058.068360] env[60400]: DEBUG nova.compute.manager [req-7802a1ca-3424-4038-96e8-683359df7fca req-be96deaa-096b-4585-8cb0-d0058b008310 service nova] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Received event network-vif-deleted-07fc5d48-3095-4921-945f-cb713ab9fcdc {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1063.273584] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1063.928506] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1063.933136] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1064.933866] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1064.934125] env[60400]: DEBUG 
nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1064.934193] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1064.952623] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1064.952785] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1064.952917] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1064.953055] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1064.953184] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1064.953307] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. 
{{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1065.932870] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1065.950885] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1066.932585] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1067.934053] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1067.934053] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1067.934371] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1068.932815] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1068.943616] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1068.943988] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1068.943988] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1068.944135] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1068.945252] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a982ec83-4c4c-4758-a5e6-0de7c026ef70 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.954229] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef3ccf45-a61a-40e3-a342-a55c9a5e24af {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.967593] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7526b592-746d-4714-bd8f-8086d709ac87 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.973659] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0df82eb-6165-438d-816a-13b223e1624f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.003406] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181792MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1069.003559] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1069.003738] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1069.052775] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1069.052922] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance b5ad6145-8bf0-4aed-951b-eb11dd87ed7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1069.053058] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance c6ee7d41-5522-4019-9da9-8503ec99e2b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1069.053180] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance d97a55c5-f248-482a-9986-212e84bdd0b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1069.053296] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance e924a9ab-71c1-4efe-a217-b036ec785dc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1069.053480] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1069.053627] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1069.117738] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcda54ef-1be2-4b80-9cb2-8e58f93f12b4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.125128] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11baa1af-d464-4680-a1cf-da3c25449ed3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.153784] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-315fc086-4c77-4987-b32a-dd2eef2e4edb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.160474] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-640cb5d7-4225-4c26-a42a-435947a32781 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.173115] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1069.181130] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1069.193638] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1069.193760] env[60400]: DEBUG 
oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1081.593425] env[60400]: WARNING oslo_vmware.rw_handles [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1081.593425] env[60400]: ERROR oslo_vmware.rw_handles [ 1081.593425] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/c8a6849e-df04-4dcd-8408-11737ef8e7c6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1081.595200] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1081.595445] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Copying Virtual Disk [datastore1] vmware_temp/c8a6849e-df04-4dcd-8408-11737ef8e7c6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/c8a6849e-df04-4dcd-8408-11737ef8e7c6/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1081.595708] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e231cfac-a306-41fb-a639-6586a1f5d44f {{(pid=60400) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1081.602902] env[60400]: DEBUG oslo_vmware.api [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Waiting for the task: (returnval){ [ 1081.602902] env[60400]: value = "task-449883" [ 1081.602902] env[60400]: _type = "Task" [ 1081.602902] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1081.610755] env[60400]: DEBUG oslo_vmware.api [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Task: {'id': task-449883, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1082.113563] env[60400]: DEBUG oslo_vmware.exceptions [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1082.113740] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1082.114294] env[60400]: ERROR nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1082.114294] env[60400]: Faults: ['InvalidArgument'] [ 1082.114294] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Traceback (most recent call last): [ 1082.114294] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1082.114294] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] yield resources [ 1082.114294] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1082.114294] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self.driver.spawn(context, instance, image_meta, [ 1082.114294] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1082.114294] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1082.114294] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1082.114294] env[60400]: ERROR nova.compute.manager [instance: 
b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self._fetch_image_if_missing(context, vi) [ 1082.114294] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] image_cache(vi, tmp_image_ds_loc) [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] vm_util.copy_virtual_disk( [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] session._wait_for_task(vmdk_copy_task) [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return self.wait_for_task(task_ref) [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return evt.wait() [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] result = hub.switch() [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1082.114682] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return self.greenlet.switch() [ 1082.115244] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1082.115244] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self.f(*self.args, **self.kw) [ 1082.115244] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1082.115244] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] raise exceptions.translate_fault(task_info.error) [ 1082.115244] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1082.115244] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Faults: ['InvalidArgument'] [ 1082.115244] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1082.115244] env[60400]: INFO nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 
tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Terminating instance [ 1082.116240] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1082.116501] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1082.116730] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e4f6ee3f-8cb1-4bca-aea5-246b25e1a3c0 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.119173] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1082.119368] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1082.120203] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1590c048-57b4-4dd4-8cff-098d849dbac9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.126398] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1082.126578] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6c987cc4-0578-4d67-9c47-57e51a260293 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.128608] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1082.128770] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1082.129671] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cbc14408-1b4a-40ff-9423-ad5cdac08d05 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.134239] env[60400]: DEBUG oslo_vmware.api [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Waiting for the task: (returnval){ [ 1082.134239] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]522e37c5-d9e3-f71a-adf6-c4ecbbb84f7f" [ 1082.134239] env[60400]: _type = "Task" [ 1082.134239] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1082.141100] env[60400]: DEBUG oslo_vmware.api [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]522e37c5-d9e3-f71a-adf6-c4ecbbb84f7f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1082.644206] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1082.644458] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Creating directory with path [datastore1] vmware_temp/7d00533f-cf0d-4187-a0f7-830d0932bf48/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1082.644680] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-11ba11b2-8ac4-4a30-89dc-43e91f7cf552 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.664678] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Created directory with path [datastore1] vmware_temp/7d00533f-cf0d-4187-a0f7-830d0932bf48/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1082.664765] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Fetch image to [datastore1] vmware_temp/7d00533f-cf0d-4187-a0f7-830d0932bf48/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1082.665410] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] 
vmware_temp/7d00533f-cf0d-4187-a0f7-830d0932bf48/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1082.665704] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6ed55cd-ac6f-4d42-ba8a-85fefc37135f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.672774] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16a056b8-857a-4224-b241-c079666bd46a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.681727] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-735700e1-208c-407a-8a6b-cf71db2913a6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.711537] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89117ba8-7245-47be-9d98-dfc8c32c0d31 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.717163] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f5f28ddc-7117-420b-ae69-5d4c09b9520f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.745090] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1082.788980] env[60400]: DEBUG oslo_vmware.rw_handles [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7d00533f-cf0d-4187-a0f7-830d0932bf48/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1082.844562] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1082.848653] env[60400]: DEBUG oslo_vmware.rw_handles [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Completed reading data from the image iterator. 
{{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1082.848872] env[60400]: DEBUG oslo_vmware.rw_handles [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7d00533f-cf0d-4187-a0f7-830d0932bf48/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1084.975627] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1084.976034] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1084.976094] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Deleting the datastore file [datastore1] b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1084.976315] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8ce1dc60-f20c-4d61-b206-031f0da7a2f3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1084.982452] env[60400]: DEBUG oslo_vmware.api [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Waiting for the task: (returnval){ [ 1084.982452] env[60400]: value = "task-449885" [ 1084.982452] env[60400]: _type = "Task" [ 1084.982452] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1084.989611] env[60400]: DEBUG oslo_vmware.api [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Task: {'id': task-449885, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1085.491723] env[60400]: DEBUG oslo_vmware.api [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Task: {'id': task-449885, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074349} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1085.491969] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1085.492140] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1085.492308] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1085.492474] env[60400]: INFO nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Took 3.37 seconds to destroy the instance on the hypervisor. [ 1085.494999] env[60400]: DEBUG nova.compute.claims [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1085.495174] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1085.495378] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1085.520965] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1085.521621] env[60400]: DEBUG nova.compute.utils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Instance b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88 could not be found. 
{{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1085.522867] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Instance disappeared during build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1085.523043] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1085.523206] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1085.523369] env[60400]: DEBUG nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1085.523524] env[60400]: DEBUG nova.network.neutron [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1085.693573] env[60400]: DEBUG neutronclient.v2_0.client [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60400) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1085.694919] env[60400]: ERROR nova.compute.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Traceback (most recent call last): [ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self.driver.spawn(context, instance, image_meta, [ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self._fetch_image_if_missing(context, vi) [ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] image_cache(vi, tmp_image_ds_loc) [ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1085.694919] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] vm_util.copy_virtual_disk( [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] session._wait_for_task(vmdk_copy_task) [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return self.wait_for_task(task_ref) [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return evt.wait() [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] result = hub.switch() [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return self.greenlet.switch() [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self.f(*self.args, **self.kw) [ 1085.695272] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] raise exceptions.translate_fault(task_info.error) [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Faults: ['InvalidArgument'] [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] During handling of the above exception, another exception occurred: [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Traceback (most recent call last): [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self._build_and_run_instance(context, instance, image, [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] with excutils.save_and_reraise_exception(): [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self.force_reraise() [ 1085.695624] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] raise self.value [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] with self.rt.instance_claim(context, instance, node, allocs, [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self.abort() [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 
1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self.tracker.abort_instance_claim(self.context, self.instance, [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return f(*args, **kwargs) [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self._unset_instance_host_and_node(instance) [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1085.696050] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] instance.save() [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] updates, result = self.indirection_api.object_action( [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return cctxt.call(context, 'object_action', objinst=objinst, [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] result = self.transport._send( [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return self._driver.send(target, ctxt, message, [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1085.696490] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] raise result [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] nova.exception_Remote.InstanceNotFound_Remote: Instance b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88 could not be found. 
[ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Traceback (most recent call last): [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return getattr(target, method)(*args, **kwargs) [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return fn(self, *args, **kwargs) [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] old_ref, inst_ref = db.instance_update_and_get_original( [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return f(*args, **kwargs) [ 1085.696848] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] with excutils.save_and_reraise_exception() as ectxt: [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self.force_reraise() [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] raise self.value [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return f(*args, 
**kwargs) [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return f(context, *args, **kwargs) [ 1085.697271] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] raise exception.InstanceNotFound(instance_id=uuid) [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] nova.exception.InstanceNotFound: Instance b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88 could not be found. [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] During handling of the above exception, another exception occurred: [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Traceback (most recent call last): [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] ret = obj(*args, **kwargs) [ 1085.697694] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] exception_handler_v20(status_code, error_body) [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] raise client_exc(message=error_message, [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1085.698172] 
env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Neutron server returns request_ids: ['req-5067eb60-3736-4953-86bc-8cde416f1bb7'] [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] During handling of the above exception, another exception occurred: [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Traceback (most recent call last): [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self._deallocate_network(context, instance, requested_networks) [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1085.698172] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self.network_api.deallocate_for_instance( [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/network/neutron.py", line 1798, in deallocate_for_instance [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] data = neutron.list_ports(**search_opts) [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] ret = obj(*args, **kwargs) [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return self.list('ports', self.ports_path, retrieve_all, [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] ret = obj(*args, **kwargs) [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] for r in self._pagination(collection, path, **params): [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] res = self.get(path, params=params) [ 1085.698535] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] ret = obj(*args, **kwargs) [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return self.retry_request("GET", action, body=body, [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] ret = obj(*args, **kwargs) [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] return self.do_request(method, action, body=body, [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] ret = obj(*args, **kwargs) [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] self._handle_fault_response(status_code, replybody, resp) [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1085.698897] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] raise exception.Unauthorized() [ 1085.699233] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] nova.exception.Unauthorized: Not authorized. 
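Note: the second chained exception explains why the cleanup escalated. The build held the instance lock for ~350 seconds (released just below), long enough for the request's keystone token to expire, so the list_ports call made while deallocating networking returned 401 and the wrapper at nova/network/neutron.py:204 translated neutronclient's Unauthorized into nova.exception.Unauthorized. A minimal sketch, with placeholder endpoint and credentials, of binding a Neutron client to a keystoneauth1 session so tokens are re-fetched automatically instead of expiring mid-operation:

    from keystoneauth1 import identity, session
    from neutronclient.v2_0 import client as neutron_client

    # Placeholder credentials and endpoint, for illustration only.
    auth = identity.Password(
        auth_url='http://keystone.example:5000/v3',
        username='nova', password='secret', project_name='service',
        user_domain_id='default', project_domain_id='default')
    sess = session.Session(auth=auth)

    # A session-backed client re-authenticates on token expiry rather
    # than surfacing a 401 to the caller.
    neutron = neutron_client.Client(session=sess)
    ports = neutron.list_ports(
        device_id='b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88')['ports']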
[ 1085.699233] env[60400]: ERROR nova.compute.manager [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] [ 1085.716252] env[60400]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 350.059s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1124.194595] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1124.929247] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1124.932681] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1125.934465] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1125.934792] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1125.934792] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1125.950203] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1125.950351] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1125.950480] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1125.950601] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Skipping network cache update for instance because it is Building.
{{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1125.950747] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1125.950882] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1125.951681] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1127.933848] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1127.934243] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1128.933550] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1128.933550] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1130.416055] env[60400]: WARNING oslo_vmware.rw_handles [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1130.416055] env[60400]: ERROR oslo_vmware.rw_handles [ 1130.416055] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/7d00533f-cf0d-4187-a0f7-830d0932bf48/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1130.417727] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1130.417990] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Copying Virtual Disk [datastore1] vmware_temp/7d00533f-cf0d-4187-a0f7-830d0932bf48/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/7d00533f-cf0d-4187-a0f7-830d0932bf48/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1130.418315] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-70819198-5aec-452b-8742-f33943cd9cda {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.427307] env[60400]: DEBUG oslo_vmware.api [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Waiting for the task: (returnval){ [ 1130.427307] 
env[60400]: value = "task-449886" [ 1130.427307] env[60400]: _type = "Task" [ 1130.427307] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1130.435354] env[60400]: DEBUG oslo_vmware.api [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Task: {'id': task-449886, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1130.932287] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1130.937593] env[60400]: DEBUG oslo_vmware.exceptions [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1130.937823] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1130.938369] env[60400]: ERROR nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1130.938369] env[60400]: Faults: ['InvalidArgument'] [ 1130.938369] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Traceback (most recent call last): [ 1130.938369] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1130.938369] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] yield resources [ 1130.938369] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1130.938369] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] self.driver.spawn(context, instance, image_meta, [ 1130.938369] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1130.938369] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1130.938369] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1130.938369] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] self._fetch_image_if_missing(context, vi) [ 1130.938369] env[60400]: ERROR nova.compute.manager [instance: 
19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] image_cache(vi, tmp_image_ds_loc) [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] vm_util.copy_virtual_disk( [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] session._wait_for_task(vmdk_copy_task) [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] return self.wait_for_task(task_ref) [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] return evt.wait() [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] result = hub.switch() [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1130.938746] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] return self.greenlet.switch() [ 1130.939192] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1130.939192] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] self.f(*self.args, **self.kw) [ 1130.939192] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1130.939192] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] raise exceptions.translate_fault(task_info.error) [ 1130.939192] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1130.939192] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Faults: ['InvalidArgument'] [ 1130.939192] env[60400]: ERROR nova.compute.manager [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] [ 1130.939192] env[60400]: INFO nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] 
Terminating instance [ 1130.940492] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1130.941249] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1130.942010] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1130.942217] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1130.943087] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2ebf686b-5407-4468-a10f-5ee8e396687e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.945436] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ad3afd9-a94f-416f-9831-b7102d36543e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.948672] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1130.948878] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1130.949042] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1130.949191] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1130.950127] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41aead00-2747-4d96-80c5-d39867f23bf1 {{(pid=60400) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.958061] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1130.960693] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0c9ccce4-48ec-408d-824c-9407dc064d19 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.962061] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1130.962228] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1130.963736] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b6b3c5b-999d-4b49-9176-93227ce50806 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.967324] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a329b373-6788-43d6-bd8f-e3ade9808d5e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.973271] env[60400]: DEBUG oslo_vmware.api [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for the task: (returnval){ [ 1130.973271] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52e0531b-81dd-a531-704c-7d1422753bd6" [ 1130.973271] env[60400]: _type = "Task" [ 1130.973271] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1130.983361] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c777d88-7c83-4d42-a42b-a8c27a8f588b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.991598] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1130.991816] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Creating directory with path [datastore1] vmware_temp/7508f0c8-f7ed-4e0b-a432-04dde616bec1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1130.992055] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-87ef1630-de55-4048-818d-81e712bd58fc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.994239] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54846125-821f-46df-ac92-a402bacfded5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.023885] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181592MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1131.024080] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1131.024229] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1131.027519] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Created directory with path [datastore1] vmware_temp/7508f0c8-f7ed-4e0b-a432-04dde616bec1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1131.027702] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Fetch image to [datastore1] vmware_temp/7508f0c8-f7ed-4e0b-a432-04dde616bec1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1131.027870] 
env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/7508f0c8-f7ed-4e0b-a432-04dde616bec1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1131.028621] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04083c4c-964c-4634-9d2c-1835918a3095 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.032159] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1131.032354] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1131.032516] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Deleting the datastore file [datastore1] 19881c50-a8ff-411f-b570-d4dc9ef3b0dc {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1131.033076] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e76f9f86-d83c-4c72-b535-1d6c48b4023d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.037500] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e125c08-2d9c-449f-aca3-d6aa76ce59e8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.040591] env[60400]: DEBUG oslo_vmware.api [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Waiting for the task: (returnval){ [ 1131.040591] env[60400]: value = "task-449888" [ 1131.040591] env[60400]: _type = "Task" [ 1131.040591] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1131.050944] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92ca986c-a806-4309-b59c-16ca07a9d8cc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.060952] env[60400]: DEBUG oslo_vmware.api [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Task: {'id': task-449888, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1131.093186] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a43d2be5-7e1d-4750-9e60-3d7973f00d07 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.099353] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0229a783-8e98-4b35-a0e7-179c3c223219 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.109081] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1131.109194] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance b5ad6145-8bf0-4aed-951b-eb11dd87ed7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1131.109312] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance c6ee7d41-5522-4019-9da9-8503ec99e2b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1131.109443] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance d97a55c5-f248-482a-9986-212e84bdd0b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1131.109590] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance e924a9ab-71c1-4efe-a217-b036ec785dc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1131.109902] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1131.109902] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1131.119349] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1131.163301] env[60400]: DEBUG oslo_vmware.rw_handles [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7508f0c8-f7ed-4e0b-a432-04dde616bec1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1131.217479] env[60400]: DEBUG oslo_vmware.rw_handles [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1131.217657] env[60400]: DEBUG oslo_vmware.rw_handles [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7508f0c8-f7ed-4e0b-a432-04dde616bec1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1131.224800] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb310a22-4f97-4d22-a7db-46a26438e1ce {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.232657] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42ad9435-eb2e-4874-9c34-cdcb1093fc8a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.262203] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d8f5a6b-0be7-4687-8028-0538832fc80b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.268965] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2b87168-0457-401b-b2c5-6d4fbb3a62ed {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.282732] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1131.291325] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1131.304426] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1131.304585] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.280s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1131.551858] env[60400]: DEBUG oslo_vmware.api [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Task: {'id': task-449888, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062724} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1131.552219] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1131.552335] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1131.552420] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1131.552586] env[60400]: INFO nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1131.554702] env[60400]: DEBUG nova.compute.claims [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1131.554866] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1131.555081] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1131.579720] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1131.580450] env[60400]: DEBUG nova.compute.utils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Instance 19881c50-a8ff-411f-b570-d4dc9ef3b0dc could not be found. 
{{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1131.581933] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Instance disappeared during build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1131.582111] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1131.582268] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1131.582427] env[60400]: DEBUG nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1131.582578] env[60400]: DEBUG nova.network.neutron [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1131.663780] env[60400]: DEBUG nova.network.neutron [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1131.673047] env[60400]: INFO nova.compute.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Took 0.09 seconds to deallocate network for instance.
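Note: the spawn failure above for 19881c50-a8ff-411f-b570-d4dc9ef3b0dc repeats below for c6ee7d41-5522-4019-9da9-8503ec99e2b5 with the same signature: the HTTP write handle for tmp-sparse.vmdk is dropped by the host (RemoteDisconnected), and the following VirtualDiskManager.CopyVirtualDisk_Task fails with "A specified parameter was not correct: fileType" (Faults: ['InvalidArgument']), consistent with the interrupted upload leaving an unusable descriptor in devstack-image-cache_base. The "Fault InvalidArgument not matched" DEBUG line means oslo.vmware has no dedicated exception class for that fault name, so it surfaces as the generic VimFaultException. A sketch of distinguishing this fault while waiting on the copy task; it assumes `session` is an established oslo_vmware.api.VMwareAPISession, `copy_task` is the task reference returned by the CopyVirtualDisk_Task call, and invalidate_image_cache() is a hypothetical recovery helper, not Nova's built-in behavior:

    from oslo_vmware import exceptions as vexc

    try:
        # Polls the vCenter task, as the _poll_task lines above show.
        session.wait_for_task(copy_task)
    except vexc.VimFaultException as e:
        # Fault names that oslo.vmware could not match to a specific
        # exception class are carried in fault_list.
        if 'InvalidArgument' in (e.fault_list or []):
            # Likely a truncated or corrupt source VMDK in the image
            # cache; re-fetch the image before retrying the copy.
            invalidate_image_cache()  # hypothetical helper
        raise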
[ 1131.710900] env[60400]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "19881c50-a8ff-411f-b570-d4dc9ef3b0dc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 292.089s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1144.580523] env[60400]: DEBUG nova.compute.manager [req-d3473e67-36e8-4857-9b26-b4d5a4dbe985 req-9259e989-e241-409c-b0fd-4797034edf26 service nova] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Received event network-vif-deleted-64fa08d6-5cd6-4437-b6ca-08257e3f0696 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1176.622873] env[60400]: WARNING oslo_vmware.rw_handles [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1176.622873] env[60400]: ERROR oslo_vmware.rw_handles [ 1176.623622] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/7508f0c8-f7ed-4e0b-a432-04dde616bec1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1176.624996] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1176.625260] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Copying Virtual Disk [datastore1] vmware_temp/7508f0c8-f7ed-4e0b-a432-04dde616bec1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1]
vmware_temp/7508f0c8-f7ed-4e0b-a432-04dde616bec1/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1176.625564] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-53ebfb88-8545-464f-bad1-01cecb0b7b8a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1176.633845] env[60400]: DEBUG oslo_vmware.api [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for the task: (returnval){ [ 1176.633845] env[60400]: value = "task-449889" [ 1176.633845] env[60400]: _type = "Task" [ 1176.633845] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1176.642112] env[60400]: DEBUG oslo_vmware.api [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Task: {'id': task-449889, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1177.143700] env[60400]: DEBUG oslo_vmware.exceptions [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1177.143945] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1177.144536] env[60400]: ERROR nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1177.144536] env[60400]: Faults: ['InvalidArgument'] [ 1177.144536] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Traceback (most recent call last): [ 1177.144536] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1177.144536] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] yield resources [ 1177.144536] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1177.144536] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] self.driver.spawn(context, instance, image_meta, [ 1177.144536] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1177.144536] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] self._vmops.spawn(context, instance, 
image_meta, injected_files, [ 1177.144536] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1177.144536] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] self._fetch_image_if_missing(context, vi) [ 1177.144536] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] image_cache(vi, tmp_image_ds_loc) [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] vm_util.copy_virtual_disk( [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] session._wait_for_task(vmdk_copy_task) [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] return self.wait_for_task(task_ref) [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] return evt.wait() [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] result = hub.switch() [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1177.144887] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] return self.greenlet.switch() [ 1177.145230] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1177.145230] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] self.f(*self.args, **self.kw) [ 1177.145230] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1177.145230] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] raise exceptions.translate_fault(task_info.error) [ 1177.145230] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1177.145230] env[60400]: ERROR nova.compute.manager [instance: 
c6ee7d41-5522-4019-9da9-8503ec99e2b5] Faults: ['InvalidArgument'] [ 1177.145230] env[60400]: ERROR nova.compute.manager [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] [ 1177.145230] env[60400]: INFO nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Terminating instance [ 1177.146427] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1177.146628] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1177.146853] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-27d49151-47ab-4a2a-87d7-8cf964cd7fc3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1177.149177] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "refresh_cache-c6ee7d41-5522-4019-9da9-8503ec99e2b5" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1177.149324] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquired lock "refresh_cache-c6ee7d41-5522-4019-9da9-8503ec99e2b5" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1177.149481] env[60400]: DEBUG nova.network.neutron [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 1177.155887] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1177.156065] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Folder [datastore1] devstack-image-cache_base created. 
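[Editor's note] The failed CopyVirtualDisk_Task is reported by vCenter as a VIM fault named InvalidArgument ("A specified parameter was not correct: fileType"). The "Fault InvalidArgument not matched" line is oslo.vmware's get_fault_class() finding no dedicated exception class for that fault name, so the task error surfaces as the generic VimFaultException with the fault names kept in its fault_list attribute. A minimal caller-side sketch of that pattern (illustrative, not Nova's actual handler; `session` and `vmdk_copy_task` stand in for the objects in the traceback):

    from oslo_vmware import exceptions as vexc

    try:
        session.wait_for_task(vmdk_copy_task)
    except vexc.VimFaultException as e:
        # No specific exception class matched, so the VIM fault names
        # ride along on the generic exception instead.
        if 'InvalidArgument' in e.fault_list:
            # e.g. rebuild the copy spec before retrying, or fail the build
            raise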
[ 1177.156745] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5062b4de-fef6-4bf2-990a-e5a3c1704512 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1177.159504] env[60400]: DEBUG nova.compute.utils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Can not refresh info_cache because instance was not found {{(pid=60400) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}}
[ 1177.164219] env[60400]: DEBUG oslo_vmware.api [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Waiting for the task: (returnval){
[ 1177.164219] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52b6bc5f-8ca0-f013-31c1-192f366d36f5"
[ 1177.164219] env[60400]: _type = "Task"
[ 1177.164219] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1177.171445] env[60400]: DEBUG oslo_vmware.api [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52b6bc5f-8ca0-f013-31c1-192f366d36f5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1177.184729] env[60400]: DEBUG nova.network.neutron [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 1177.262010] env[60400]: DEBUG nova.network.neutron [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1177.271658] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Releasing lock "refresh_cache-c6ee7d41-5522-4019-9da9-8503ec99e2b5" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1177.272099] env[60400]: DEBUG nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Start destroying the instance on the hypervisor.
{{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1177.272290] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1177.273314] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab062675-f044-430f-95a1-a0b9a3f8ee51 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1177.282252] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1177.282457] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-139f2506-3ade-4aab-89aa-386cc2640992 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1177.318326] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1177.318522] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1177.318691] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Deleting the datastore file [datastore1] c6ee7d41-5522-4019-9da9-8503ec99e2b5 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1177.318921] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c3e6d1bf-2170-47ee-8859-3070934fbbe1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1177.324302] env[60400]: DEBUG oslo_vmware.api [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for the task: (returnval){
[ 1177.324302] env[60400]: value = "task-449891"
[ 1177.324302] env[60400]: _type = "Task"
[ 1177.324302] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1177.331532] env[60400]: DEBUG oslo_vmware.api [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Task: {'id': task-449891, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
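[Editor's note] The "Waiting for the task ... progress is N%" pairs are oslo.vmware's task-polling loop: wait_for_task() repeatedly reads the vCenter task's info property until it reaches a terminal state, then returns the result or raises the translated fault (the same path the earlier traceback went down). A condensed sketch of that loop (simplified; the real library drives this from a looping call with progress logging rather than a sleep loop):

    import time

    from oslo_vmware import exceptions, vim_util

    def wait_for_task(session, task_ref, interval=0.5):
        while True:
            info = session.invoke_api(vim_util, 'get_object_property',
                                      session.vim, task_ref, 'info')
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # same call the traceback shows at api.py:448
                raise exceptions.translate_fault(info.error)
            time.sleep(interval)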
[ 1177.674968] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1177.675326] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Creating directory with path [datastore1] vmware_temp/b5b17d8d-e9b9-46b0-be2c-3036fc17384f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1177.675517] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f04e0cd5-5d21-497d-9af2-1ca9f6ca6038 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1177.686934] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Created directory with path [datastore1] vmware_temp/b5b17d8d-e9b9-46b0-be2c-3036fc17384f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1177.687162] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Fetch image to [datastore1] vmware_temp/b5b17d8d-e9b9-46b0-be2c-3036fc17384f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1177.687339] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/b5b17d8d-e9b9-46b0-be2c-3036fc17384f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1177.688034] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b6e47f5-812b-410f-b324-4d74973a1ca7 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1177.694461] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef43576d-9fea-4769-9d27-65b6392a52e3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1177.703226] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-085723a7-91e2-4f3e-93df-b41f1ebc2933 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1177.733828] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fd1f802-e134-4908-b7d5-e6c39c277d8c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1177.739407]
env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-640f8c18-4838-4ede-96b3-dc0914be289f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1177.760194] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1177.803240] env[60400]: DEBUG oslo_vmware.rw_handles [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b5b17d8d-e9b9-46b0-be2c-3036fc17384f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1177.858743] env[60400]: DEBUG oslo_vmware.rw_handles [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1177.858906] env[60400]: DEBUG oslo_vmware.rw_handles [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b5b17d8d-e9b9-46b0-be2c-3036fc17384f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1177.862577] env[60400]: DEBUG oslo_vmware.api [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Task: {'id': task-449891, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.031571} completed successfully. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1177.862795] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1177.862997] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1177.863187] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1177.863355] env[60400]: INFO nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Took 0.59 seconds to destroy the instance on the hypervisor.
[ 1177.863572] env[60400]: DEBUG oslo.service.loopingcall [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1177.863768] env[60400]: DEBUG nova.compute.manager [-] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Skipping network deallocation for instance since networking was not requested. {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
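[Editor's note] The oslo.service.loopingcall line is the retry machinery around network deallocation: the manager wraps _deallocate_network_with_retries in a looping call and blocks until it signals completion. The general shape of that pattern, reduced to a minimal sketch (not Nova's exact wrapper; do_deallocate and RetriableError are placeholders):

    from oslo_service import loopingcall

    def _attempt():
        try:
            do_deallocate()                      # placeholder work function
        except RetriableError:                   # placeholder exception type
            return                               # loop will call us again
        raise loopingcall.LoopingCallDone()      # success: stop the loop

    timer = loopingcall.FixedIntervalLoopingCall(_attempt)
    timer.start(interval=2).wait()   # emits "Waiting for function ... to return."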
[ 1177.866081] env[60400]: DEBUG nova.compute.claims [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 1177.866254] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1177.866461] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1177.891524] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1177.892186] env[60400]: DEBUG nova.compute.utils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Instance c6ee7d41-5522-4019-9da9-8503ec99e2b5 could not be found. {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1177.893468] env[60400]: DEBUG nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Instance disappeared during build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
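[Editor's note] The Acquiring/acquired/released triplet around abort_instance_claim is oslo.concurrency's synchronized decorator at work: it serializes access to the resource tracker's accounting within this process and logs how long each caller waited for and held the lock. The same pattern, reduced to its core (illustrative names, not Nova's actual helper):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(tracker, instance):
        # Only one thread of this process may mutate the tracker's usage
        # tally at a time; the decorator emits the waited/held log lines.
        tracker.drop_claim(instance)   # placeholder for the real cleanup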
[ 1177.893629] env[60400]: DEBUG nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1177.893830] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "refresh_cache-c6ee7d41-5522-4019-9da9-8503ec99e2b5" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1177.893968] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquired lock "refresh_cache-c6ee7d41-5522-4019-9da9-8503ec99e2b5" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1177.894129] env[60400]: DEBUG nova.network.neutron [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}}
[ 1177.901324] env[60400]: DEBUG nova.compute.utils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Can not refresh info_cache because instance was not found {{(pid=60400) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}}
[ 1177.918567] env[60400]: DEBUG nova.network.neutron [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 1177.975829] env[60400]: DEBUG nova.network.neutron [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1177.984427] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Releasing lock "refresh_cache-c6ee7d41-5522-4019-9da9-8503ec99e2b5" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1177.984639] env[60400]: DEBUG nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1177.984902] env[60400]: DEBUG nova.compute.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Skipping network deallocation for instance since networking was not requested. {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1178.024203] env[60400]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "c6ee7d41-5522-4019-9da9-8503ec99e2b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.385s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1178.975959] env[60400]: DEBUG nova.compute.manager [req-8ac5f179-c34b-4be3-a3de-46076a311ba0 req-37ec723a-2ed3-46e2-aec6-8a4615597f8d service nova] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Received event network-vif-deleted-312be5f9-1d4e-4308-bc4a-71c10a1778b6 {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1182.933591] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1182.933927] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Cleaning up deleted instances with incomplete migration {{(pid=60400) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11133}} [ 1183.946740] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1183.946740] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1183.946740] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Cleaning up deleted instances {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11095}} [ 1184.003480] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] There are 14 instances to clean {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11104}} [ 1184.003765] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.041822] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.087170] env[60400]: DEBUG nova.compute.manager [None 
req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.110829] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.137890] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.158769] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.184273] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.203862] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.224344] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.244531] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.263160] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.282141] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.300572] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1184.319178] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Instance has had 0 of 5 cleanup attempts {{(pid=60400) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1185.328404] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1185.928439] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1186.933053] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1186.933053] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1187.104981] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1187.115785] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Getting list of instances from cluster (obj){
[ 1187.115785] env[60400]: value = "domain-c8"
[ 1187.115785] env[60400]: _type = "ClusterComputeResource"
[ 1187.115785] env[60400]: } {{(pid=60400) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 1187.116706] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e927cf81-6b22-4fc8-b9d6-6b8bf7463340 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1187.130068] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Got total of 5 instances {{(pid=60400) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 1187.130227] env[60400]: WARNING nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] While synchronizing instance power states, found 1 instances in the database and 5 instances on the hypervisor.
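[Editor's note] The warning above is the _sync_power_states periodic task comparing Nova's database view of this host (one active instance) against what the driver reports (five VMs, consistent with the leftover test instances the cleanup task listed earlier). Condensed, the comparison looks roughly like this (a simplified sketch of the periodic task, with the manager's collaborators passed in explicitly):

    from nova import objects

    def check_power_state_drift(context, host, driver, LOG):
        db_instances = objects.InstanceList.get_by_host(context, host)
        num_vm_instances = driver.get_num_instances()
        if len(db_instances) != num_vm_instances:
            LOG.warning('While synchronizing instance power states, found '
                        '%s instances in the database and %s instances on '
                        'the hypervisor.', len(db_instances), num_vm_instances)
        # each DB instance is then synced individually, under its own lock,
        # which is what the "Triggering sync for uuid ..." lines below show
        return db_instances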
[ 1187.130362] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Triggering sync for uuid 0257c136-6f30-43ae-8f8d-e8f23d8328ef {{(pid=60400) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 1187.130680] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1187.954196] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1187.970561] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1187.970764] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1187.970899] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1187.980716] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1187.980716] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1187.980716] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1188.932693] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1188.932899] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1189.936235] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1192.379422] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "f114d70b-3524-4f1c-b1af-71ae3235d040" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.379760] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "f114d70b-3524-4f1c-b1af-71ae3235d040" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.388389] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Starting instance... {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1192.493655] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.493895] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.495401] env[60400]: INFO nova.compute.claims [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1192.552870] env[60400]: DEBUG nova.scheduler.client.report [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Refreshing inventories for resource provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1192.565996] env[60400]: DEBUG nova.scheduler.client.report [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 
tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Updating ProviderTree inventory for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1192.566212] env[60400]: DEBUG nova.compute.provider_tree [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Updating inventory in ProviderTree for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1192.575679] env[60400]: DEBUG nova.scheduler.client.report [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Refreshing aggregate associations for resource provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc, aggregates: None {{(pid=60400) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1192.590548] env[60400]: DEBUG nova.scheduler.client.report [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Refreshing trait associations for resource provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=60400) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1192.625166] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45a22561-e63c-4a42-a811-5d2c395639e1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.632724] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a51bc681-e6de-4acb-a5b9-39016e02d6ec {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.663213] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c994b44-6be1-4b6d-a23a-003aad567cae {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.670734] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb3a8006-3a34-4aa6-9c1b-435043d0a0d8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.683525] env[60400]: DEBUG nova.compute.provider_tree [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 
tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1192.693077] env[60400]: DEBUG nova.scheduler.client.report [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1192.710606] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.215s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1192.710606] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Start building networks asynchronously for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1192.741343] env[60400]: DEBUG nova.compute.utils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Using /dev/sd instead of None {{(pid=60400) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1192.742725] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Allocating IP information in the background. {{(pid=60400) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1192.742894] env[60400]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] allocate_for_instance() {{(pid=60400) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1192.752178] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Start building block device mappings for instance. {{(pid=60400) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
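[Editor's note] The inventory dict that keeps appearing for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc is what placement uses to compute schedulable capacity: for each resource class, capacity = (total - reserved) * allocation_ratio, with max_unit capping any single allocation. A quick worked check against the logged numbers (plain illustration of the formula, not placement's code):

    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0, 'max_unit': 118},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity, 'max per allocation:', inv['max_unit'])
    # VCPU:      (48 - 0) * 4.0      = 192.0 schedulable, <= 16 per allocation
    # MEMORY_MB: (196590 - 512) * 1.0 = 196078.0 MB,      <= 65530 per allocation
    # DISK_GB:   (400 - 0) * 1.0      = 400.0 GB,         <= 118 per allocation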
[ 1192.799332] env[60400]: DEBUG nova.policy [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '101ef53aa5c0412b8a7cd0abe6761419', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '07237bcd8b47450cae1f09b3c693038e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60400) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1192.813276] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Start spawning the instance on the hypervisor. {{(pid=60400) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1192.833792] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T04:32:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T04:32:17Z,direct_url=,disk_format='vmdk',id=f5dfd970-7a56-4489-873c-2c3b6fbd9fe9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c82f07917ba4819a6bcf09e15f9f9cf',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T04:32:18Z,virtual_size=,visibility=), allow threads: False {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}}
[ 1192.834030] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Flavor limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}}
[ 1192.834183] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Image limits 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}}
[ 1192.834358] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Flavor pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}}
[ 1192.834495] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Image pref 0:0:0 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}}
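[Editor's note] With no hw: extra specs on the flavor and none on the image, every topology constraint above is unset: the limits default high and the preferences to zero, so for the 1-vCPU m1.nano flavor the only way to factor vcpus into sockets * cores * threads is 1:1:1, which is exactly what the next lines choose. The enumeration step, reduced to a few lines (plain illustration of the arithmetic, not nova.virt.hardware itself):

    vcpus = 1
    possible = [(s, c, t)
                for s in range(1, vcpus + 1)
                for c in range(1, vcpus + 1)
                for t in range(1, vcpus + 1)
                if s * c * t == vcpus]
    print(possible)   # [(1, 1, 1)] -> "Got 1 possible topologies"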
[ 1192.834632] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60400) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}}
[ 1192.834832] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}}
[ 1192.834984] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}}
[ 1192.835157] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Got 1 possible topologies {{(pid=60400) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}}
[ 1192.835314] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}}
[ 1192.835479] env[60400]: DEBUG nova.virt.hardware [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60400) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}}
[ 1192.836359] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45bcc1dd-e7e1-45ed-9046-bbe044369b0e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1192.844068] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d15d287-6598-46be-93d4-790482ca2f53 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1192.933288] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1192.943791] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.944029] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.944196] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1192.944352] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1192.945448] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92b81a76-289e-4c34-888c-44ac6c4d0dc9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.954306] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c49171a6-0c60-4be9-a317-a05b6d76be0f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.969801] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40395e69-b637-481d-9554-b571eaf76b6e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.975939] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-038eb02c-dc0d-4439-bafd-181cbca8a672 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.004940] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181639MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1193.005100] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1193.005281] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1193.049484] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1193.049639] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance f114d70b-3524-4f1c-b1af-71ae3235d040 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1193.049871] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1193.050021] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1193.079400] env[60400]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Successfully created port: de9c125c-8971-4a4d-afe3-0b364b13d48d {{(pid=60400) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1193.097868] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c89a2e4a-1209-4a0e-8fda-8129306a2a68 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.105557] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46250cc5-c17f-4f48-8ba9-82194f5158ae {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.139155] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71f2b704-b01e-4656-9bc8-6c5cf0e9c551 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.146535] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c15e02f3-2b80-4e9d-b61a-66592db3d912 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.164270] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1193.175059] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1193.188347] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1193.188526] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1193.622855] env[60400]: DEBUG nova.compute.manager [req-fdd6de81-6663-475f-bdbb-de2e463681df req-9358cb81-17cf-4d87-bc97-16ee00476739 service nova] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Received event network-vif-plugged-de9c125c-8971-4a4d-afe3-0b364b13d48d {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1193.623118] env[60400]: DEBUG oslo_concurrency.lockutils [req-fdd6de81-6663-475f-bdbb-de2e463681df req-9358cb81-17cf-4d87-bc97-16ee00476739 service nova] Acquiring lock "f114d70b-3524-4f1c-b1af-71ae3235d040-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1193.623279] env[60400]: DEBUG oslo_concurrency.lockutils [req-fdd6de81-6663-475f-bdbb-de2e463681df req-9358cb81-17cf-4d87-bc97-16ee00476739 service nova] Lock "f114d70b-3524-4f1c-b1af-71ae3235d040-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1193.623435] env[60400]: DEBUG oslo_concurrency.lockutils [req-fdd6de81-6663-475f-bdbb-de2e463681df req-9358cb81-17cf-4d87-bc97-16ee00476739 service nova] Lock "f114d70b-3524-4f1c-b1af-71ae3235d040-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1193.623591] env[60400]: DEBUG nova.compute.manager [req-fdd6de81-6663-475f-bdbb-de2e463681df req-9358cb81-17cf-4d87-bc97-16ee00476739 service nova] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] No waiting events found dispatching network-vif-plugged-de9c125c-8971-4a4d-afe3-0b364b13d48d {{(pid=60400) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1193.623777] env[60400]: WARNING nova.compute.manager [req-fdd6de81-6663-475f-bdbb-de2e463681df req-9358cb81-17cf-4d87-bc97-16ee00476739 service nova] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Received unexpected event network-vif-plugged-de9c125c-8971-4a4d-afe3-0b364b13d48d for instance with vm_state building and task_state spawning. 
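[editor's note] The event lines above trace a prepare-then-wait handshake: the thread spawning f114d70b-3524-4f1c-b1af-71ae3235d040 registers the events it expects before plugging the VIF, and the handler for Neutron's external events pops the matching waiter under the per-instance "-events" lock; when no waiter is registered, the event is logged as unexpected, which is exactly the WARNING above (the instance was still building and had not registered a waiter yet). A minimal sketch of that coordination pattern follows, with hypothetical names and plain threading rather than Nova's eventlet-based nova.compute.manager.InstanceEvents:

```python
import threading

class InstanceEvents:
    """Toy prepare/pop coordinator; illustrative only, not Nova's code."""

    def __init__(self):
        self._lock = threading.Lock()   # stands in for the "<uuid>-events" lock
        self._waiters = {}              # (instance_uuid, event_name) -> Event

    def prepare_for_event(self, uuid, event):
        # Register the waiter *before* the action that triggers the event
        # (e.g. before the port is plugged), so the callback cannot race us.
        waiter = threading.Event()
        with self._lock:
            self._waiters[(uuid, event)] = waiter
        return waiter

    def pop_instance_event(self, uuid, event):
        # Called from the external-event handler; returns None when no
        # waiter was registered ("No waiting events found dispatching").
        with self._lock:
            return self._waiters.pop((uuid, event), None)

def external_instance_event(events, uuid, event):
    # Mirrors the dispatch seen in the log: wake the waiter if one
    # exists, otherwise record the event as unexpected.
    waiter = events.pop_instance_event(uuid, event)
    if waiter is None:
        print(f"WARNING: received unexpected event {event} for {uuid}")
    else:
        waiter.set()    # unblocks whoever called waiter.wait(timeout)
```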
[ 1193.694715] env[60400]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Successfully updated port: de9c125c-8971-4a4d-afe3-0b364b13d48d {{(pid=60400) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1193.703052] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "refresh_cache-f114d70b-3524-4f1c-b1af-71ae3235d040" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1193.703202] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquired lock "refresh_cache-f114d70b-3524-4f1c-b1af-71ae3235d040" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1193.703346] env[60400]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 1193.739385] env[60400]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Instance cache missing network info. 
{{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 1193.889167] env[60400]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Updating instance_info_cache with network_info: [{"id": "de9c125c-8971-4a4d-afe3-0b364b13d48d", "address": "fa:16:3e:d3:80:1f", "network": {"id": "7e7c0226-589d-4fd7-868b-b8c752206c39", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1977982493-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07237bcd8b47450cae1f09b3c693038e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8b29df12-5674-476d-a9e5-5e20f704d224", "external-id": "nsx-vlan-transportzone-754", "segmentation_id": 754, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapde9c125c-89", "ovs_interfaceid": "de9c125c-8971-4a4d-afe3-0b364b13d48d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1193.899437] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Releasing lock "refresh_cache-f114d70b-3524-4f1c-b1af-71ae3235d040" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1193.899751] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Instance network_info: |[{"id": "de9c125c-8971-4a4d-afe3-0b364b13d48d", "address": "fa:16:3e:d3:80:1f", "network": {"id": "7e7c0226-589d-4fd7-868b-b8c752206c39", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1977982493-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07237bcd8b47450cae1f09b3c693038e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8b29df12-5674-476d-a9e5-5e20f704d224", "external-id": "nsx-vlan-transportzone-754", "segmentation_id": 754, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapde9c125c-89", "ovs_interfaceid": "de9c125c-8971-4a4d-afe3-0b364b13d48d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60400) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1193.900095] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d3:80:1f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8b29df12-5674-476d-a9e5-5e20f704d224', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'de9c125c-8971-4a4d-afe3-0b364b13d48d', 'vif_model': 'vmxnet3'}] {{(pid=60400) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1193.907474] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Creating folder: Project (07237bcd8b47450cae1f09b3c693038e). Parent ref: group-v119075. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1193.907944] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-843991fa-d112-4212-938d-c78319ddd256 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.918641] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Created folder: Project (07237bcd8b47450cae1f09b3c693038e) in parent group-v119075. [ 1193.918802] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Creating folder: Instances. Parent ref: group-v119150. {{(pid=60400) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1193.919009] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6b57464e-f24d-44ec-9981-e22b1c3c214e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.928047] env[60400]: INFO nova.virt.vmwareapi.vm_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Created folder: Instances in parent group-v119150. [ 1193.928250] env[60400]: DEBUG oslo.service.loopingcall [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1193.928407] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Creating VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1193.928575] env[60400]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-793c6827-0cf9-4762-8c1c-68bfb19de46b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.946756] env[60400]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1193.946756] env[60400]: value = "task-449894" [ 1193.946756] env[60400]: _type = "Task" [ 1193.946756] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1193.953636] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449894, 'name': CreateVM_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1194.456504] env[60400]: DEBUG oslo_vmware.api [-] Task: {'id': task-449894, 'name': CreateVM_Task, 'duration_secs': 0.282981} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1194.456657] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Created VM on the ESX host {{(pid=60400) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1194.457615] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1194.457615] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1194.457835] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1194.458100] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-94e956a4-044f-4212-8d59-14a3129363fc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.462665] env[60400]: DEBUG oslo_vmware.api [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Waiting for the task: (returnval){ [ 1194.462665] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5266365b-4316-29d1-6c9c-dace7a6a2f2c" [ 1194.462665] env[60400]: _type = "Task" [ 1194.462665] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1194.470341] env[60400]: DEBUG oslo_vmware.api [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5266365b-4316-29d1-6c9c-dace7a6a2f2c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1194.973080] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1194.973381] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Processing image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1194.973573] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1195.648388] env[60400]: DEBUG nova.compute.manager [req-bc2eecca-1f01-4723-a5ee-989f17758f96 req-a2a95cae-a9ef-46b5-a397-f5a1d921413e service nova] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Received event network-changed-de9c125c-8971-4a4d-afe3-0b364b13d48d {{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1195.648579] env[60400]: DEBUG nova.compute.manager [req-bc2eecca-1f01-4723-a5ee-989f17758f96 req-a2a95cae-a9ef-46b5-a397-f5a1d921413e service nova] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Refreshing instance network info cache due to event network-changed-de9c125c-8971-4a4d-afe3-0b364b13d48d. 
{{(pid=60400) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1195.648781] env[60400]: DEBUG oslo_concurrency.lockutils [req-bc2eecca-1f01-4723-a5ee-989f17758f96 req-a2a95cae-a9ef-46b5-a397-f5a1d921413e service nova] Acquiring lock "refresh_cache-f114d70b-3524-4f1c-b1af-71ae3235d040" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1195.648994] env[60400]: DEBUG oslo_concurrency.lockutils [req-bc2eecca-1f01-4723-a5ee-989f17758f96 req-a2a95cae-a9ef-46b5-a397-f5a1d921413e service nova] Acquired lock "refresh_cache-f114d70b-3524-4f1c-b1af-71ae3235d040" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1195.649142] env[60400]: DEBUG nova.network.neutron [req-bc2eecca-1f01-4723-a5ee-989f17758f96 req-a2a95cae-a9ef-46b5-a397-f5a1d921413e service nova] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Refreshing network info cache for port de9c125c-8971-4a4d-afe3-0b364b13d48d {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} [ 1195.870162] env[60400]: DEBUG nova.network.neutron [req-bc2eecca-1f01-4723-a5ee-989f17758f96 req-a2a95cae-a9ef-46b5-a397-f5a1d921413e service nova] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Updated VIF entry in instance network info cache for port de9c125c-8971-4a4d-afe3-0b364b13d48d. {{(pid=60400) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} [ 1195.870514] env[60400]: DEBUG nova.network.neutron [req-bc2eecca-1f01-4723-a5ee-989f17758f96 req-a2a95cae-a9ef-46b5-a397-f5a1d921413e service nova] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Updating instance_info_cache with network_info: [{"id": "de9c125c-8971-4a4d-afe3-0b364b13d48d", "address": "fa:16:3e:d3:80:1f", "network": {"id": "7e7c0226-589d-4fd7-868b-b8c752206c39", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1977982493-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07237bcd8b47450cae1f09b3c693038e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8b29df12-5674-476d-a9e5-5e20f704d224", "external-id": "nsx-vlan-transportzone-754", "segmentation_id": 754, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapde9c125c-89", "ovs_interfaceid": "de9c125c-8971-4a4d-afe3-0b364b13d48d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1195.879393] env[60400]: DEBUG oslo_concurrency.lockutils [req-bc2eecca-1f01-4723-a5ee-989f17758f96 req-a2a95cae-a9ef-46b5-a397-f5a1d921413e service nova] Releasing lock "refresh_cache-f114d70b-3524-4f1c-b1af-71ae3235d040" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1225.450196] env[60400]: WARNING oslo_vmware.rw_handles [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Error occurred while reading the HTTP response.: 
http.client.RemoteDisconnected: Remote end closed connection without response [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1225.450196] env[60400]: ERROR oslo_vmware.rw_handles [ 1225.451401] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/b5b17d8d-e9b9-46b0-be2c-3036fc17384f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1225.452661] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1225.452958] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Copying Virtual Disk [datastore1] vmware_temp/b5b17d8d-e9b9-46b0-be2c-3036fc17384f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/b5b17d8d-e9b9-46b0-be2c-3036fc17384f/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1225.453289] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a2596add-1890-4036-ae06-4f9677fa274d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1225.461633] env[60400]: DEBUG oslo_vmware.api [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Waiting for the task: (returnval){ [ 1225.461633] env[60400]: value = "task-449895" [ 1225.461633] env[60400]: _type = "Task" [ 1225.461633] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1225.469592] env[60400]: DEBUG oslo_vmware.api [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Task: {'id': task-449895, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1225.971853] env[60400]: DEBUG oslo_vmware.exceptions [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1225.972103] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1225.972655] env[60400]: ERROR nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1225.972655] env[60400]: Faults: ['InvalidArgument'] [ 1225.972655] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Traceback (most recent call last): [ 1225.972655] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1225.972655] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] yield resources [ 1225.972655] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1225.972655] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] self.driver.spawn(context, instance, image_meta, [ 1225.972655] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1225.972655] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1225.972655] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1225.972655] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] self._fetch_image_if_missing(context, vi) [ 1225.972655] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] image_cache(vi, tmp_image_ds_loc) [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in 
_cache_sparse_image [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] vm_util.copy_virtual_disk( [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] session._wait_for_task(vmdk_copy_task) [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] return self.wait_for_task(task_ref) [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] return evt.wait() [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] result = hub.switch() [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1225.972996] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] return self.greenlet.switch() [ 1225.973461] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1225.973461] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] self.f(*self.args, **self.kw) [ 1225.973461] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1225.973461] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] raise exceptions.translate_fault(task_info.error) [ 1225.973461] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1225.973461] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Faults: ['InvalidArgument'] [ 1225.973461] env[60400]: ERROR nova.compute.manager [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] [ 1225.973461] env[60400]: INFO nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Terminating instance [ 1225.974505] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1225.974697] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1225.974914] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2360dfdc-0798-4609-9f4f-acc8668be927 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1225.977194] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1225.977384] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1225.978099] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4adfe0bc-d2f6-4169-8e8b-ea8bbc8b56ad {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1225.984591] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1225.984820] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e1b43df3-c589-4e90-a9fd-985cab4f8919 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1225.986840] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1225.987012] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1225.987937] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0664dd05-5b8e-4157-a532-7bf47e86bb60 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1225.992564] env[60400]: DEBUG oslo_vmware.api [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Waiting for the task: (returnval){ [ 1225.992564] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5245ab99-c0fd-4044-91af-dd1ff67b1a39" [ 1225.992564] env[60400]: _type = "Task" [ 1225.992564] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1225.999426] env[60400]: DEBUG oslo_vmware.api [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]5245ab99-c0fd-4044-91af-dd1ff67b1a39, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1226.052139] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1226.052391] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1226.052604] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Deleting the datastore file [datastore1] b5ad6145-8bf0-4aed-951b-eb11dd87ed7d {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1226.052872] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9b097c97-4d82-4698-bd59-a259483258e5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1226.060100] env[60400]: DEBUG oslo_vmware.api [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Waiting for the task: (returnval){ [ 1226.060100] env[60400]: value = "task-449897" [ 1226.060100] env[60400]: _type = "Task" [ 1226.060100] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1226.067584] env[60400]: DEBUG oslo_vmware.api [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Task: {'id': task-449897, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1226.502750] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1226.503173] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Creating directory with path [datastore1] vmware_temp/5c454ef6-6e43-463f-ab8e-988a8373467b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1226.503173] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4ccf4957-67e3-47af-97f4-13eeea635dc9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1226.514328] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Created directory with path [datastore1] vmware_temp/5c454ef6-6e43-463f-ab8e-988a8373467b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1226.514506] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Fetch image to [datastore1] vmware_temp/5c454ef6-6e43-463f-ab8e-988a8373467b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1226.514650] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/5c454ef6-6e43-463f-ab8e-988a8373467b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1226.515330] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ac727a6-7793-48c4-91c1-1e66123a4500 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1226.521576] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46501744-8f71-454d-95f3-44859d81cb03 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1226.530158] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a543727e-1eb8-4f15-b976-0f6e18b7f859 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1226.560248] env[60400]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69a17533-8cf9-497c-85cc-c5e63a6e9a3f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1226.570090] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f966644b-7d97-4f92-abfc-d9b2c2afd872 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1226.571645] env[60400]: DEBUG oslo_vmware.api [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Task: {'id': task-449897, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079408} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1226.571861] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1226.572067] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1226.572212] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1226.572376] env[60400]: INFO nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Took 0.60 seconds to destroy the instance on the hypervisor. 
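[editor's note] The "Waiting for the task: (returnval){ ... } to complete" blocks, the "_poll_task ... progress is 0%" lines, and the completion record with duration_secs all come from oslo.vmware's task-polling loop: wait_for_task reads TaskInfo on an interval until the task reaches success or error, and on error translates the vSphere fault into a Python exception, which is how the CopyVirtualDisk_Task failure earlier surfaced as VimFaultException with Faults: ['InvalidArgument']. A hedged sketch of driving that same API directly; the endpoint, credentials, and datastore path are placeholders, and it assumes a reachable vCenter/ESX host:

```python
from oslo_vmware import api, exceptions, vim_util

# Placeholder endpoint and credentials (in the log these come from nova.conf).
session = api.VMwareAPISession(
    'vc.example.test', 'user', 'secret',
    api_retry_count=3, task_poll_interval=0.5)

# Start an asynchronous vSphere task, mirroring the
# DeleteDatastoreFile_Task invocation above, then block on it.
file_manager = session.vim.service_content.fileManager
dc_ref = vim_util.get_moref('ha-datacenter', 'Datacenter')
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore1] vmware_temp/example-dir',   # hypothetical path
    datacenter=dc_ref)
try:
    # Polls TaskInfo periodically (the "progress is N%" lines) and
    # returns the final TaskInfo once the task state is 'success'.
    task_info = session.wait_for_task(task)
    print('completed: %s' % task_info.state)
except exceptions.VimFaultException as e:
    # vSphere faults (e.g. ['InvalidArgument']) are translated into
    # Python exceptions here, as in the CopyVirtualDisk_Task failure.
    print('task failed: %s faults=%s' % (e, e.fault_list))
```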
[ 1226.574470] env[60400]: DEBUG nova.compute.claims [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1226.574580] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1226.574790] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1226.592372] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1226.602573] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1226.603230] env[60400]: DEBUG nova.compute.utils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Instance b5ad6145-8bf0-4aed-951b-eb11dd87ed7d could not be found. {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1226.604622] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Instance disappeared during build. {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1226.604784] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1226.604941] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1226.605114] env[60400]: DEBUG nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1226.605283] env[60400]: DEBUG nova.network.neutron [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1226.630515] env[60400]: DEBUG nova.network.neutron [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1226.638547] env[60400]: DEBUG oslo_vmware.rw_handles [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5c454ef6-6e43-463f-ab8e-988a8373467b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1226.640611] env[60400]: INFO nova.compute.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Took 0.04 seconds to deallocate network for instance. [ 1226.696635] env[60400]: DEBUG oslo_vmware.rw_handles [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1226.696805] env[60400]: DEBUG oslo_vmware.rw_handles [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5c454ef6-6e43-463f-ab8e-988a8373467b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1226.714611] env[60400]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "b5ad6145-8bf0-4aed-951b-eb11dd87ed7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 277.587s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1244.188581] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1246.935207] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1246.935513] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1247.928574] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1247.932237] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1247.932389] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}}
[ 1247.932506] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 1247.944250] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 1247.944533] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 1247.944533] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}}
[ 1247.945385] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1249.932865] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1249.933248] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1249.933290] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
[ 1252.933123] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1252.943013] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1252.943271] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1252.943432] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1252.943585] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1252.944679] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-948304cc-8957-4871-866b-41aa81e5748d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1252.953404] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0010a6e5-fd6d-43eb-b1c9-065e4a73d5a5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1252.966962] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d327b72-db49-465c-85cd-4531763dd464 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1252.973115] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-675e1d43-3bdb-490e-8408-49981ca18a93 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.002279] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181645MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1253.002388] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1253.003021] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1253.040217] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 1253.040361] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance f114d70b-3524-4f1c-b1af-71ae3235d040 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}}
[ 1253.040530] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1253.040663] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1253.073185] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc3a65e5-1e19-4971-8a40-72d8a16d2dd9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.079753] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27f7216e-0f24-4af5-b3db-6a339c0810c3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.108453] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac86cbcc-ef5e-4883-bf79-8ec46592a07a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.114815] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b062a1f-f789-4c0b-95b4-102e2d9a003c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.128397] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1253.135919] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1253.147889] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1253.148128] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.145s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1274.345454] env[60400]: WARNING oslo_vmware.rw_handles [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles response.begin()
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1274.345454] env[60400]: ERROR oslo_vmware.rw_handles
[ 1274.346283] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/5c454ef6-6e43-463f-ab8e-988a8373467b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1274.347926] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1274.348183] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Copying Virtual Disk [datastore1] vmware_temp/5c454ef6-6e43-463f-ab8e-988a8373467b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/5c454ef6-6e43-463f-ab8e-988a8373467b/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1274.348505] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-54992bce-10e0-48fb-b87a-fd40d2139274 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1274.358783] env[60400]: DEBUG oslo_vmware.api [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Waiting for the task: (returnval){
[ 1274.358783] env[60400]: value = "task-449898"
[ 1274.358783] env[60400]: _type = "Task"
[ 1274.358783] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1274.366872] env[60400]: DEBUG oslo_vmware.api [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Task: {'id': task-449898, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1274.869119] env[60400]: DEBUG oslo_vmware.exceptions [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 1274.869302] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1274.869868] env[60400]: ERROR nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1274.869868] env[60400]: Faults: ['InvalidArgument']
[ 1274.869868] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Traceback (most recent call last):
[ 1274.869868] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1274.869868] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] yield resources
[ 1274.869868] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1274.869868] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] self.driver.spawn(context, instance, image_meta,
[ 1274.869868] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1274.869868] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1274.869868] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1274.869868] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] self._fetch_image_if_missing(context, vi)
[ 1274.869868] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] image_cache(vi, tmp_image_ds_loc)
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] vm_util.copy_virtual_disk(
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] session._wait_for_task(vmdk_copy_task)
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] return self.wait_for_task(task_ref)
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] return evt.wait()
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] result = hub.switch()
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1274.870453] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] return self.greenlet.switch()
[ 1274.870804] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1274.870804] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] self.f(*self.args, **self.kw)
[ 1274.870804] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1274.870804] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] raise exceptions.translate_fault(task_info.error)
[ 1274.870804] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1274.870804] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Faults: ['InvalidArgument']
[ 1274.870804] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef]
[ 1274.870804] env[60400]: INFO nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Terminating instance
[ 1274.871728] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1274.871925] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1274.872230] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ad1bb3e5-d0e2-4b6e-ae0f-7c9ea1f65658 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1274.874362] env[60400]: DEBUG nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1274.874552] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1274.875315] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4765c0ab-bd80-4364-b146-aefedd061691 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1274.882178] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1274.882416] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-89a48c37-c7c7-4a99-89aa-e1a7890deea4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1274.884505] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1274.884669] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1274.885602] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-84cdc0e9-52c0-4fe7-87ff-7098b886f1b4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1274.891104] env[60400]: DEBUG oslo_vmware.api [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for the task: (returnval){
[ 1274.891104] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52de5cbf-bbed-fac5-4142-78aa221e5428"
[ 1274.891104] env[60400]: _type = "Task"
[ 1274.891104] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1274.898056] env[60400]: DEBUG oslo_vmware.api [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52de5cbf-bbed-fac5-4142-78aa221e5428, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1274.955659] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1274.955850] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1274.956052] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Deleting the datastore file [datastore1] 0257c136-6f30-43ae-8f8d-e8f23d8328ef {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1274.956288] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7f67badb-a7ab-495b-bab4-3891a5f524f4 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1274.962934] env[60400]: DEBUG oslo_vmware.api [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Waiting for the task: (returnval){
[ 1274.962934] env[60400]: value = "task-449900"
[ 1274.962934] env[60400]: _type = "Task"
[ 1274.962934] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1274.970153] env[60400]: DEBUG oslo_vmware.api [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Task: {'id': task-449900, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1275.400734] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1275.401072] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Creating directory with path [datastore1] vmware_temp/c70f5f07-a87f-4b75-b9c7-ed3dc948a486/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1275.401192] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8ca8e646-9197-47ad-ac8c-1e46a9872b00 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.412596] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Created directory with path [datastore1] vmware_temp/c70f5f07-a87f-4b75-b9c7-ed3dc948a486/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1275.412778] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Fetch image to [datastore1] vmware_temp/c70f5f07-a87f-4b75-b9c7-ed3dc948a486/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1275.413024] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/c70f5f07-a87f-4b75-b9c7-ed3dc948a486/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1275.413935] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35d6e025-c91d-4830-ba4b-3520bf14b36e {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.421327] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b660571-7c4f-4fe0-acc7-0ecf8f33b3cf {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.431225] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8914e6be-7156-4902-8ed0-8ff1e164c9aa {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.462997] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eb233b0-ef9b-401c-912d-f4b3daeb48e5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.474020] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-65f0956f-b26b-4244-878e-493ef84cb26a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.475641] env[60400]: DEBUG oslo_vmware.api [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Task: {'id': task-449900, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069456} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1275.475856] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1275.476039] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1275.476206] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1275.476374] env[60400]: INFO nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1275.478567] env[60400]: DEBUG nova.compute.claims [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 1275.478664] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1275.478857] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1275.497359] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1275.548829] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-676b5724-35e8-4928-ba7e-cd36accc1508 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.557386] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7fb6acb-2f53-4bf2-b10d-7c468837aeac {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.589172] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73d94125-e08b-4a28-9503-e344e54fe205 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.596896] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-186bc714-ede6-4bec-8e8b-2d33a81901b1 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.611901] env[60400]: DEBUG nova.compute.provider_tree [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1275.620414] env[60400]: DEBUG nova.scheduler.client.report [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1275.633499] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.155s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1275.634017] env[60400]: ERROR nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1275.634017] env[60400]: Faults: ['InvalidArgument']
[ 1275.634017] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Traceback (most recent call last):
[ 1275.634017] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1275.634017] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] self.driver.spawn(context, instance, image_meta,
[ 1275.634017] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1275.634017] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1275.634017] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1275.634017] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] self._fetch_image_if_missing(context, vi)
[ 1275.634017] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1275.634017] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] image_cache(vi, tmp_image_ds_loc)
[ 1275.634017] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] vm_util.copy_virtual_disk(
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] session._wait_for_task(vmdk_copy_task)
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] return self.wait_for_task(task_ref)
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] return evt.wait()
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] result = hub.switch()
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] return self.greenlet.switch()
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1275.634443] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] self.f(*self.args, **self.kw)
[ 1275.634747] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1275.634747] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] raise exceptions.translate_fault(task_info.error)
[ 1275.634747] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1275.634747] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Faults: ['InvalidArgument']
[ 1275.634747] env[60400]: ERROR nova.compute.manager [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef]
[ 1275.634747] env[60400]: DEBUG nova.compute.utils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] VimFaultException {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1275.636113] env[60400]: DEBUG nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Build of instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef was re-scheduled: A specified parameter was not correct: fileType
[ 1275.636113] env[60400]: Faults: ['InvalidArgument'] {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1275.636476] env[60400]: DEBUG nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1275.636643] env[60400]: DEBUG nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1275.636807] env[60400]: DEBUG nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1275.636966] env[60400]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 1275.651691] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1275.652404] env[60400]: ERROR nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.
[ 1275.652404] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Traceback (most recent call last):
[ 1275.652404] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1275.652404] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1275.652404] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1275.652404] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] result = getattr(controller, method)(*args, **kwargs)
[ 1275.652404] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1275.652404] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] return self._get(image_id)
[ 1275.652404] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1275.652404] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1275.652404] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] resp, body = self.http_client.get(url, headers=header)
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] return self.request(url, 'GET', **kwargs)
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] return self._handle_response(resp)
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] raise exc.from_response(resp, resp.content)
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0]
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] During handling of the above exception, another exception occurred:
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0]
[ 1275.652713] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Traceback (most recent call last):
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] yield resources
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] self.driver.spawn(context, instance, image_meta,
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] self._fetch_image_if_missing(context, vi)
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] image_fetch(context, vi, tmp_image_ds_loc)
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] images.fetch_image(
[ 1275.653298] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] metadata = IMAGE_API.get(context, image_ref)
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] return session.show(context, image_id,
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] _reraise_translated_image_exception(image_id)
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] raise new_exc.with_traceback(exc_trace)
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] result = getattr(controller, method)(*args, **kwargs)
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1275.653729] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] return self._get(image_id)
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] resp, body = self.http_client.get(url, headers=header)
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] return self.request(url, 'GET', **kwargs)
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] return self._handle_response(resp)
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] raise exc.from_response(resp, resp.content)
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] nova.exception.ImageNotAuthorized: Not authorized for image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.
[ 1275.654063] env[60400]: ERROR nova.compute.manager [instance: d97a55c5-f248-482a-9986-212e84bdd0b0]
[ 1275.654430] env[60400]: INFO nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Terminating instance
[ 1275.655426] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1275.655627] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1275.656098] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "refresh_cache-d97a55c5-f248-482a-9986-212e84bdd0b0" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1275.656252] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquired lock "refresh_cache-d97a55c5-f248-482a-9986-212e84bdd0b0" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1275.656406] env[60400]: DEBUG nova.network.neutron [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}}
[ 1275.657229] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bf79b99e-7074-4fe3-92b2-5109706c22b2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.662866] env[60400]: DEBUG nova.compute.utils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Can not refresh info_cache because instance was not found {{(pid=60400) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}}
[ 1275.665836] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1275.666010] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1275.668526] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-325b5d9c-9210-4444-9690-cb9b7c7655cc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.674330] env[60400]: DEBUG oslo_vmware.api [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Waiting for the task: (returnval){
[ 1275.674330] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52ac253f-6317-c5de-6b29-540540f88afb"
[ 1275.674330] env[60400]: _type = "Task"
[ 1275.674330] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1275.681958] env[60400]: DEBUG oslo_vmware.api [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52ac253f-6317-c5de-6b29-540540f88afb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1275.711760] env[60400]: DEBUG nova.network.neutron [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 1275.909176] env[60400]: DEBUG nova.network.neutron [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1275.917489] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Releasing lock "refresh_cache-d97a55c5-f248-482a-9986-212e84bdd0b0" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1275.918070] env[60400]: DEBUG nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1275.918324] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1275.919425] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96c9de04-152a-4531-98be-7045fdf7ca9d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.927627] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1275.927872] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-464b9c01-3701-43a4-b3da-6dfa38763a3c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.951629] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1275.951826] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1275.952080] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Deleting the datastore file [datastore1] d97a55c5-f248-482a-9986-212e84bdd0b0 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1275.952602] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bbdd080f-337d-4d6a-a67a-deeefdf27cb9 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1275.958496] env[60400]: DEBUG oslo_vmware.api [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for the task: (returnval){
[ 1275.958496] env[60400]: value = "task-449902"
[ 1275.958496] env[60400]: _type = "Task"
[ 1275.958496] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1275.966416] env[60400]: DEBUG oslo_vmware.api [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Task: {'id': task-449902, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1276.057175] env[60400]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1276.066859] env[60400]: INFO nova.compute.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Took 0.43 seconds to deallocate network for instance.
[ 1276.147838] env[60400]: INFO nova.scheduler.client.report [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Deleted allocations for instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef
[ 1276.164079] env[60400]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 389.246s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1276.164306] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 193.320s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1276.164519] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1276.164714] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1276.164893] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60400) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1276.166710] env[60400]: INFO nova.compute.manager [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Terminating instance [ 1276.168456] env[60400]: DEBUG nova.compute.manager [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1276.168644] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1276.169095] env[60400]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b89f4497-da75-4310-b6f5-6cbe9df1c9eb {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1276.180665] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35f0b9b7-8787-4186-adcd-8e80db2243c6 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1276.198827] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1276.199066] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Creating directory with path [datastore1] vmware_temp/f18a138d-a7a9-4415-92e9-fd95d9f755d2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1276.199270] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0dc06f92-3122-4484-a3f5-79547dc8ff48 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1276.205528] env[60400]: WARNING nova.virt.vmwareapi.vmops [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef could not be found. 
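A note on the lock lines that dominate this span: the "Acquiring lock ... by ...", "Lock ... acquired ... waited Ns" and "Lock ... released ... held Ns" messages are emitted by oslo.concurrency's lock wrapper, not by Nova itself. A minimal sketch of the two forms visible above (decorator and context manager), with placeholder bodies:

    from oslo_concurrency import lockutils

    # Decorator form, as used around the resource tracker's critical
    # sections (the "compute_resources" lock above). The wrapper logs
    # the Acquiring/acquired/released DEBUG lines with waited/held times.
    @lockutils.synchronized('compute_resources')
    def abort_instance_claim():
        pass  # only one greenthread per process mutates tracker state

    # Context-manager form, as used around the instance cache refresh.
    with lockutils.lock('refresh_cache-d97a55c5-f248-482a-9986-212e84bdd0b0'):
        pass  # rebuild and publish the instance's network info cache
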
[ 1276.205708] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1276.205875] env[60400]: INFO nova.compute.manager [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1276.206116] env[60400]: DEBUG oslo.service.loopingcall [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1276.206570] env[60400]: DEBUG nova.compute.manager [-] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1276.206665] env[60400]: DEBUG nova.network.neutron [-] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1276.215892] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Created directory with path [datastore1] vmware_temp/f18a138d-a7a9-4415-92e9-fd95d9f755d2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1276.216115] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Fetch image to [datastore1] vmware_temp/f18a138d-a7a9-4415-92e9-fd95d9f755d2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1276.216286] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/f18a138d-a7a9-4415-92e9-fd95d9f755d2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1276.216962] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24b6a26c-942f-43b8-a933-8e511e99bd63 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1276.223165] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcba8aa2-e78e-4661-9bbd-dca63325ce7c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} 
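The "Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return." line above comes from oslo.service's retry helper in loopingcall.py. A hedged sketch of that pattern; the retry parameters and the stand-in exception here are illustrative, not necessarily Nova's actual arguments:

    from oslo_service import loopingcall

    def deallocate_for_instance():
        pass  # placeholder for the real Neutron deallocation call

    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=2,
                                max_sleep_time=30,
                                exceptions=(ConnectionError,))
    def _deallocate_network_with_retries():
        # If the body raises one of the listed exceptions, the decorator
        # sleeps with an increasing backoff and re-invokes it, logging
        # the "Waiting for function ... to return." DEBUG line seen above.
        deallocate_for_instance()
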
[ 1276.233316] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04778ece-bc7c-4325-86d0-453511123208 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1276.265284] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15839f71-49c1-4103-a4f9-008a01b42878 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1276.267732] env[60400]: DEBUG nova.network.neutron [-] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1276.272492] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d4f8bd64-82f4-47a4-8c70-4efc266575df {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1276.275918] env[60400]: INFO nova.compute.manager [-] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Took 0.07 seconds to deallocate network for instance. [ 1276.293414] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1276.340430] env[60400]: DEBUG oslo_vmware.rw_handles [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f18a138d-a7a9-4415-92e9-fd95d9f755d2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1276.393589] env[60400]: DEBUG oslo_concurrency.lockutils [None req-c1ca1cf6-2c7e-4ee0-a6f5-1b2038693288 tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.229s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1276.394391] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 89.264s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1276.394571] env[60400]: INFO nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] During sync_power_state the instance has a pending task (deleting). Skip. 
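The write path behind "Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2...:443/folder/..." is a streamed PUT against the datastore's /folder endpoint. A hedged, stdlib-only reduction of what oslo.vmware's rw_handles does under the hood; the host, path and size are taken from the log line above, while authentication, error handling and the payload are placeholders:

    import http.client

    HOST = 'esx7c2n2.openstack.eu-de-1.cloud.sap'
    PATH = ('/folder/vmware_temp/f18a138d-a7a9-4415-92e9-fd95d9f755d2/'
            'f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk'
            '?dcPath=ha-datacenter&dsName=datastore1')
    FILE_SIZE = 21318656

    def image_chunks():
        # Placeholder for the Glance image iterator; yields raw VMDK bytes.
        yield b'\x00' * 4096

    conn = http.client.HTTPSConnection(HOST, 443)
    conn.putrequest('PUT', PATH)
    conn.putheader('Content-Length', str(FILE_SIZE))
    # Authentication is elided here; the log shows the ticket being
    # obtained via SessionManager.AcquireGenericServiceTicket just before
    # the connection is opened.
    conn.endheaders()
    for chunk in image_chunks():
        conn.send(chunk)
    resp = conn.getresponse()  # the call that can raise RemoteDisconnected
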
[ 1276.394736] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "0257c136-6f30-43ae-8f8d-e8f23d8328ef" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1276.396986] env[60400]: DEBUG oslo_vmware.rw_handles [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1276.397163] env[60400]: DEBUG oslo_vmware.rw_handles [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f18a138d-a7a9-4415-92e9-fd95d9f755d2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1276.468644] env[60400]: DEBUG oslo_vmware.api [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Task: {'id': task-449902, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.034713} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1276.468983] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1276.469084] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1276.469186] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1276.469349] env[60400]: INFO nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Took 0.55 seconds to destroy the instance on the hypervisor. [ 1276.469562] env[60400]: DEBUG oslo.service.loopingcall [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60400) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1276.469760] env[60400]: DEBUG nova.compute.manager [-] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Skipping network deallocation for instance since networking was not requested. {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1276.471641] env[60400]: DEBUG nova.compute.claims [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1276.471799] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1276.472083] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1276.494635] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.023s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1276.495270] env[60400]: DEBUG nova.compute.utils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Instance d97a55c5-f248-482a-9986-212e84bdd0b0 could not be found. {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1276.496475] env[60400]: DEBUG nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Instance disappeared during build. 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1276.496636] env[60400]: DEBUG nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1276.496838] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "refresh_cache-d97a55c5-f248-482a-9986-212e84bdd0b0" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1276.496976] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquired lock "refresh_cache-d97a55c5-f248-482a-9986-212e84bdd0b0" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1276.497141] env[60400]: DEBUG nova.network.neutron [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Building network info cache for instance {{(pid=60400) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} [ 1276.503199] env[60400]: DEBUG nova.compute.utils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Can not refresh info_cache because instance was not found {{(pid=60400) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1276.519403] env[60400]: DEBUG nova.network.neutron [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Instance cache missing network info. {{(pid=60400) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 1276.576128] env[60400]: DEBUG nova.network.neutron [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1276.585248] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Releasing lock "refresh_cache-d97a55c5-f248-482a-9986-212e84bdd0b0" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1276.585456] env[60400]: DEBUG nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1276.585626] env[60400]: DEBUG nova.compute.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Skipping network deallocation for instance since networking was not requested. {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1276.624872] env[60400]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d97a55c5-f248-482a-9986-212e84bdd0b0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 324.892s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1306.149154] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1307.928722] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1307.932310] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1308.933357] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1308.933823] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1308.933823] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1308.944557] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1308.944725] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. 
{{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1308.945140] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1308.945306] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1309.933500] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1310.928699] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1310.940988] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1310.941360] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1313.933586] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1313.944028] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1313.944177] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1313.944322] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1313.944453] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1313.945484] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f05ff249-b2e3-49b5-9129-37cd1a167677 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.954343] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71e0dbe6-5427-460c-9c6f-9c8b15b4c849 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.967840] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7601798-dea5-4bde-8563-0142e732198d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.973981] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acf76bf7-09c9-4922-be36-a6c34b6086db {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1314.002128] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181656MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1314.002253] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1314.002426] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1314.037576] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Instance f114d70b-3524-4f1c-b1af-71ae3235d040 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60400) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} [ 1314.038022] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1314.038022] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1314.061852] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f813fec3-8a11-4a0c-940e-836f41b6a849 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1314.068751] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3433f690-7c9d-4c55-8b82-3509de7e94ba {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1314.097533] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f75a631d-5126-4f22-8a65-33204fb2971c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1314.104107] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca5812a4-9448-4eea-ae0e-1e4830a108a2 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1314.116427] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1314.124185] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1314.138700] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1314.138864] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1321.670811] env[60400]: WARNING oslo_vmware.rw_handles [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 
tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1321.670811] env[60400]: ERROR oslo_vmware.rw_handles [ 1321.671512] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/f18a138d-a7a9-4415-92e9-fd95d9f755d2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1321.672920] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1321.673316] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Copying Virtual Disk [datastore1] vmware_temp/f18a138d-a7a9-4415-92e9-fd95d9f755d2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/f18a138d-a7a9-4415-92e9-fd95d9f755d2/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1321.673617] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-adba9bc6-82c8-4cc5-b188-6a931998bb51 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.682498] env[60400]: DEBUG oslo_vmware.api [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Waiting for the task: (returnval){ [ 1321.682498] env[60400]: value = "task-449903" [ 1321.682498] env[60400]: _type = "Task" [ 1321.682498] env[60400]: } to complete. 
{{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1321.690046] env[60400]: DEBUG oslo_vmware.api [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Task: {'id': task-449903, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1322.193022] env[60400]: DEBUG oslo_vmware.exceptions [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1322.193258] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1322.193786] env[60400]: ERROR nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1322.193786] env[60400]: Faults: ['InvalidArgument'] [ 1322.193786] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Traceback (most recent call last): [ 1322.193786] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1322.193786] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] yield resources [ 1322.193786] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1322.193786] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self.driver.spawn(context, instance, image_meta, [ 1322.193786] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1322.193786] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1322.193786] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1322.193786] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self._fetch_image_if_missing(context, vi) [ 1322.193786] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] image_cache(vi, tmp_image_ds_loc) [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", 
line 537, in _cache_sparse_image [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] vm_util.copy_virtual_disk( [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] session._wait_for_task(vmdk_copy_task) [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return self.wait_for_task(task_ref) [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return evt.wait() [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] result = hub.switch() [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1322.194360] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return self.greenlet.switch() [ 1322.194751] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1322.194751] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self.f(*self.args, **self.kw) [ 1322.194751] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1322.194751] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] raise exceptions.translate_fault(task_info.error) [ 1322.194751] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1322.194751] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Faults: ['InvalidArgument'] [ 1322.194751] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1322.194751] env[60400]: INFO nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Terminating instance [ 1322.195587] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquired lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1322.195808] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1322.196037] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f6110936-d642-48b9-ac2d-7187b6c2ec40 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.198172] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1322.198360] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1322.199059] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b054d2fa-8dc1-4519-be37-5d719668910c {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.205424] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1322.205663] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3154be7f-29f6-43b7-ab36-3205fe523380 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.207619] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1322.207802] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60400) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1322.208754] env[60400]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a70b6bdb-67d7-47f7-90df-2cfd5ff8ae23 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.213069] env[60400]: DEBUG oslo_vmware.api [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Waiting for the task: (returnval){ [ 1322.213069] env[60400]: value = "session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52df0a4c-460d-913f-e26a-f798f7a1ff26" [ 1322.213069] env[60400]: _type = "Task" [ 1322.213069] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1322.221405] env[60400]: DEBUG oslo_vmware.api [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Task: {'id': session[52a123e0-c43e-be80-fc87-f6c8b7c83679]52df0a4c-460d-913f-e26a-f798f7a1ff26, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1322.270067] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1322.270274] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1322.270450] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Deleting the datastore file [datastore1] e924a9ab-71c1-4efe-a217-b036ec785dc8 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1322.270709] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4bba4e4d-102c-4bdd-8877-8375b5c5548a {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.277035] env[60400]: DEBUG oslo_vmware.api [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Waiting for the task: (returnval){ [ 1322.277035] env[60400]: value = "task-449905" [ 1322.277035] env[60400]: _type = "Task" [ 1322.277035] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1322.284418] env[60400]: DEBUG oslo_vmware.api [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Task: {'id': task-449905, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1322.723148] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Preparing fetch location {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1322.723542] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Creating directory with path [datastore1] vmware_temp/2c2881f0-13c7-40ff-8ea7-2cfa44fbf0b8/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1322.723614] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-24383fce-259d-4554-92a6-33244cd7ee41 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.734253] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Created directory with path [datastore1] vmware_temp/2c2881f0-13c7-40ff-8ea7-2cfa44fbf0b8/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 {{(pid=60400) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1322.734445] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Fetch image to [datastore1] vmware_temp/2c2881f0-13c7-40ff-8ea7-2cfa44fbf0b8/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1322.734582] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to [datastore1] vmware_temp/2c2881f0-13c7-40ff-8ea7-2cfa44fbf0b8/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1322.735271] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9a3885b-ff36-4b8e-99b3-8b2aaf5ea7d3 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.741573] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9c1c966-db97-4b32-9b5f-17f35323dbbc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.750131] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b61e258-a07a-493b-83da-0a07a7b7f28d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.782551] env[60400]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fecc9e71-33e9-43c5-a437-94f28c3289dc {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.788883] env[60400]: DEBUG oslo_vmware.api [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Task: {'id': task-449905, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07617} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1322.790313] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1322.790496] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1322.790659] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1322.790823] env[60400]: INFO nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Took 0.59 seconds to destroy the instance on the hypervisor. 
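The SearchDatastore_Task and DeleteDatastoreFile_Task entries above all follow the same oslo.vmware pattern: request_handler issues the SOAP call, wait_for_task parks the greenthread on an event (the evt.wait() visible in the tracebacks below), and _poll_task re-reads the TaskInfo until vCenter reports success or error — hence the repeated "progress is 0%" lines and the final record carrying 'duration_secs'. A minimal sketch of that polling contract, with get_task_info as a hypothetical stand-in for the PropertyCollector read that oslo.vmware performs internally:

import time

class TaskFaultError(Exception):
    # Stands in for oslo_vmware.exceptions.VimFaultException.
    pass

def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    # Poll a vCenter task until it leaves the queued/running states.
    # get_task_info is a hypothetical callable returning an object with
    # .state ('queued' | 'running' | 'success' | 'error'), .progress,
    # and .error -- the fields _poll_task logs as "progress is N%".
    while True:
        info = get_task_info(task_ref)
        if info.state in ('queued', 'running'):
            time.sleep(poll_interval)  # oslo.vmware drives this via a looping call
            continue
        if info.state == 'success':
            return info                # 'duration_secs' ends up in the log record
        # state == 'error': translate the fault and raise; this is how
        # VimFaultException surfaces in the tracebacks further down.
        raise TaskFaultError(info.error)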
[ 1322.792545] env[60400]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bf28a73c-7063-4e0a-bbb2-f72072695788 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.794324] env[60400]: DEBUG nova.compute.claims [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1322.794491] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1322.794691] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1322.815086] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Downloading image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1322.819867] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1322.819997] env[60400]: DEBUG nova.compute.utils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Instance e924a9ab-71c1-4efe-a217-b036ec785dc8 could not be found. {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1322.821388] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Instance disappeared during build. 
{{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1322.821548] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1322.821704] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1322.821860] env[60400]: DEBUG nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1322.822065] env[60400]: DEBUG nova.network.neutron [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1322.858575] env[60400]: DEBUG oslo_vmware.rw_handles [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2c2881f0-13c7-40ff-8ea7-2cfa44fbf0b8/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60400) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1322.916426] env[60400]: DEBUG oslo_vmware.rw_handles [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Completed reading data from the image iterator. {{(pid=60400) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1322.916651] env[60400]: DEBUG oslo_vmware.rw_handles [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2c2881f0-13c7-40ff-8ea7-2cfa44fbf0b8/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60400) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1323.008503] env[60400]: DEBUG neutronclient.v2_0.client [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60400) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1323.009933] env[60400]: ERROR nova.compute.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Traceback (most recent call last): [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self.driver.spawn(context, instance, image_meta, [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self._fetch_image_if_missing(context, vi) [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] image_cache(vi, tmp_image_ds_loc) [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1323.009933] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] vm_util.copy_virtual_disk( [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] session._wait_for_task(vmdk_copy_task) [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return self.wait_for_task(task_ref) [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1323.010305] env[60400]: ERROR nova.compute.manager 
[instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return evt.wait() [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] result = hub.switch() [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return self.greenlet.switch() [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self.f(*self.args, **self.kw) [ 1323.010305] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] raise exceptions.translate_fault(task_info.error) [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Faults: ['InvalidArgument'] [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] During handling of the above exception, another exception occurred: [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Traceback (most recent call last): [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self._build_and_run_instance(context, instance, image, [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] with excutils.save_and_reraise_exception(): [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self.force_reraise() [ 1323.010677] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] raise self.value 
[ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] with self.rt.instance_claim(context, instance, node, allocs, [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self.abort() [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self.tracker.abort_instance_claim(self.context, self.instance, [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return f(*args, **kwargs) [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self._unset_instance_host_and_node(instance) [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1323.011131] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] instance.save() [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] updates, result = self.indirection_api.object_action( [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return cctxt.call(context, 'object_action', objinst=objinst, [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] result = self.transport._send( [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return self._driver.send(target, ctxt, message, [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in 
send [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1323.011481] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] raise result [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] nova.exception_Remote.InstanceNotFound_Remote: Instance e924a9ab-71c1-4efe-a217-b036ec785dc8 could not be found. [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Traceback (most recent call last): [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return getattr(target, method)(*args, **kwargs) [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return fn(self, *args, **kwargs) [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] old_ref, inst_ref = db.instance_update_and_get_original( [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return f(*args, **kwargs) [ 1323.011793] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] with excutils.save_and_reraise_exception() as ectxt: [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self.force_reraise() [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012165] 
env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] raise self.value [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return f(*args, **kwargs) [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return f(context, *args, **kwargs) [ 1323.012165] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] raise exception.InstanceNotFound(instance_id=uuid) [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] nova.exception.InstanceNotFound: Instance e924a9ab-71c1-4efe-a217-b036ec785dc8 could not be found. 
[ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] During handling of the above exception, another exception occurred: [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Traceback (most recent call last): [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] ret = obj(*args, **kwargs) [ 1323.012533] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] exception_handler_v20(status_code, error_body) [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] raise client_exc(message=error_message, [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Neutron server returns request_ids: ['req-91ef5683-6fc1-48bf-8383-94be415aee22'] [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] During handling of the above exception, another exception occurred: [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Traceback (most recent call last): [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self._deallocate_network(context, instance, requested_networks) [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1323.012953] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self.network_api.deallocate_for_instance( [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/network/neutron.py", line 1798, in deallocate_for_instance [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: 
e924a9ab-71c1-4efe-a217-b036ec785dc8] data = neutron.list_ports(**search_opts) [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] ret = obj(*args, **kwargs) [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return self.list('ports', self.ports_path, retrieve_all, [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] ret = obj(*args, **kwargs) [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] for r in self._pagination(collection, path, **params): [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] res = self.get(path, params=params) [ 1323.013343] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] ret = obj(*args, **kwargs) [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return self.retry_request("GET", action, body=body, [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] ret = obj(*args, **kwargs) [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] return self.do_request(method, action, body=body, [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] ret = obj(*args, **kwargs) [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, 
in do_request [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] self._handle_fault_response(status_code, replybody, resp) [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1323.013721] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] raise exception.Unauthorized() [ 1323.014110] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] nova.exception.Unauthorized: Not authorized. [ 1323.014110] env[60400]: ERROR nova.compute.manager [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] [ 1323.031159] env[60400]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "e924a9ab-71c1-4efe-a217-b036ec785dc8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 340.029s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1366.141455] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1367.933470] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1368.933583] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1368.933978] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Starting heal instance info cache {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1368.933978] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Rebuilding the list of instances to heal {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1368.944562] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Skipping network cache update for instance because it is Building. {{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1368.944752] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Didn't find any instances for network info cache update. 
{{(pid=60400) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1368.945143] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1369.376773] env[60400]: WARNING oslo_vmware.rw_handles [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles response.begin() [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1369.376773] env[60400]: ERROR oslo_vmware.rw_handles [ 1369.377216] env[60400]: DEBUG nova.virt.vmwareapi.images [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Downloaded image file data f5dfd970-7a56-4489-873c-2c3b6fbd9fe9 to vmware_temp/2c2881f0-13c7-40ff-8ea7-2cfa44fbf0b8/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk on the data store datastore1 {{(pid=60400) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1369.379190] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Caching image {{(pid=60400) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1369.379467] env[60400]: DEBUG nova.virt.vmwareapi.vm_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Copying Virtual Disk [datastore1] vmware_temp/2c2881f0-13c7-40ff-8ea7-2cfa44fbf0b8/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/tmp-sparse.vmdk to [datastore1] vmware_temp/2c2881f0-13c7-40ff-8ea7-2cfa44fbf0b8/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk {{(pid=60400) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1369.379844] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with 
opID=oslo.vmware-a678e89f-f930-4f1c-9460-1ae093620e77 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1369.388213] env[60400]: DEBUG oslo_vmware.api [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Waiting for the task: (returnval){ [ 1369.388213] env[60400]: value = "task-449906" [ 1369.388213] env[60400]: _type = "Task" [ 1369.388213] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1369.396068] env[60400]: DEBUG oslo_vmware.api [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Task: {'id': task-449906, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1369.898662] env[60400]: DEBUG oslo_vmware.exceptions [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Fault InvalidArgument not matched. {{(pid=60400) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1369.898910] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Releasing lock "[datastore1] devstack-image-cache_base/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9/f5dfd970-7a56-4489-873c-2c3b6fbd9fe9.vmdk" {{(pid=60400) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1369.899486] env[60400]: ERROR nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1369.899486] env[60400]: Faults: ['InvalidArgument'] [ 1369.899486] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Traceback (most recent call last): [ 1369.899486] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1369.899486] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] yield resources [ 1369.899486] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1369.899486] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] self.driver.spawn(context, instance, image_meta, [ 1369.899486] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1369.899486] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1369.899486] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1369.899486] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] self._fetch_image_if_missing(context, vi) [ 1369.899486] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] image_cache(vi, tmp_image_ds_loc) [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] vm_util.copy_virtual_disk( [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] session._wait_for_task(vmdk_copy_task) [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] return self.wait_for_task(task_ref) [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] return evt.wait() [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] result = hub.switch() [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1369.900117] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] return self.greenlet.switch() [ 1369.900694] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1369.900694] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] self.f(*self.args, **self.kw) [ 1369.900694] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1369.900694] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] raise exceptions.translate_fault(task_info.error) [ 1369.900694] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1369.900694] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Faults: ['InvalidArgument'] [ 1369.900694] env[60400]: ERROR nova.compute.manager [instance: 
f114d70b-3524-4f1c-b1af-71ae3235d040] [ 1369.900694] env[60400]: INFO nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Terminating instance [ 1369.902611] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Start destroying the instance on the hypervisor. {{(pid=60400) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1369.902797] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Destroying instance {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1369.903521] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eb2159a-ba2d-40b6-a71b-c85aa10659be {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1369.910045] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Unregistering the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1369.910252] env[60400]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5f0d1173-bcb8-43b5-a916-3db2fe2cf06d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1369.932084] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1369.932277] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1369.973171] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Unregistered the VM {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1369.973538] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Deleting contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1369.973538] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 
tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Deleting the datastore file [datastore1] f114d70b-3524-4f1c-b1af-71ae3235d040 {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1369.973808] env[60400]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2d574d92-f161-4a6b-92c6-e0eae1cdae35 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1369.980554] env[60400]: DEBUG oslo_vmware.api [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Waiting for the task: (returnval){ [ 1369.980554] env[60400]: value = "task-449908" [ 1369.980554] env[60400]: _type = "Task" [ 1369.980554] env[60400]: } to complete. {{(pid=60400) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1369.987916] env[60400]: DEBUG oslo_vmware.api [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Task: {'id': task-449908, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1370.490666] env[60400]: DEBUG oslo_vmware.api [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Task: {'id': task-449908, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.059662} completed successfully. {{(pid=60400) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1370.490903] env[60400]: DEBUG nova.virt.vmwareapi.ds_util [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Deleted the datastore file {{(pid=60400) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1370.491096] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Deleted contents of the VM from datastore datastore1 {{(pid=60400) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1370.491264] env[60400]: DEBUG nova.virt.vmwareapi.vmops [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Instance destroyed {{(pid=60400) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1370.491432] env[60400]: INFO nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Took 0.59 seconds to destroy the instance on the hypervisor. 
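Between the two destroy sequences the log records the whole failure chain for instance f114d70b-3524-4f1c-b1af-71ae3235d040: the image is streamed over the ESX /folder HTTP endpoint into vmware_temp/.../tmp-sparse.vmdk (the RemoteDisconnected warning on closing the write handle is logged but treated as non-fatal, and the download is still reported as complete), then CopyVirtualDisk_Task is invoked to cache the disk and vCenter rejects it with InvalidArgument on fileType. Spawn then unwinds: UnregisterVM, delete the instance's datastore directory, abort the resource claim, and hand the build back for re-scheduling. A condensed, hypothetical rendering of that control flow (all four callables are illustrative stand-ins, not Nova's real helpers):

class VimFault(Exception):
    # Stands in for oslo_vmware.exceptions.VimFaultException.
    pass

def spawn_with_cleanup(upload_image, cache_image, destroy, abort_claim):
    # upload_image -- HTTP write to /folder/vmware_temp/... (rw_handles)
    # cache_image  -- CopyVirtualDisk_Task from tmp-sparse.vmdk to the
    #                 per-image cache path (the step rejected in this log)
    # destroy      -- UnregisterVM + DeleteDatastoreFile_Task
    # abort_claim  -- ResourceTracker.abort_instance_claim, taken under
    #                 the "compute_resources" lock
    tmp_vmdk = upload_image()
    try:
        cache_image(tmp_vmdk)
    except VimFault:
        destroy()       # "Unregistered the VM" / "Deleted the datastore file"
        abort_claim()   # frees the VCPU/MEMORY_MB/DISK_GB held by the claim
        raise           # the compute manager re-raises and re-schedules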
[ 1370.493551] env[60400]: DEBUG nova.compute.claims [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Aborting claim: {{(pid=60400) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1370.493714] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1370.493927] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1370.555042] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51e45346-a198-43ae-900b-47c6cfdb640f {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1370.562378] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e65fa77-d311-4fcd-ab5e-76d7e1536578 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1371.214993] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1371.216141] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc81fcda-4db0-41cf-8725-ca88afa4ba7b {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1371.224015] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2734271-f962-4e2f-894c-66b043d8afab {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1371.236935] env[60400]: DEBUG nova.compute.provider_tree [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1371.245022] env[60400]: DEBUG nova.scheduler.client.report [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1371.257791] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.764s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1371.258332] env[60400]: ERROR nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1371.258332] env[60400]: Faults: ['InvalidArgument'] [ 1371.258332] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Traceback (most recent call last): [ 1371.258332] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1371.258332] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] self.driver.spawn(context, instance, image_meta, [ 1371.258332] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1371.258332] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1371.258332] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1371.258332] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] self._fetch_image_if_missing(context, vi) [ 1371.258332] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1371.258332] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] image_cache(vi, tmp_image_ds_loc) [ 1371.258332] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] vm_util.copy_virtual_disk( [ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] session._wait_for_task(vmdk_copy_task) [ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] return 
[ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]     return evt.wait()
[ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]     result = hub.switch()
[ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]     return self.greenlet.switch()
[ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1371.258653] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]     self.f(*self.args, **self.kw)
[ 1371.258965] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1371.258965] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]     raise exceptions.translate_fault(task_info.error)
[ 1371.258965] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1371.258965] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Faults: ['InvalidArgument']
[ 1371.258965] env[60400]: ERROR nova.compute.manager [instance: f114d70b-3524-4f1c-b1af-71ae3235d040]
[ 1371.259114] env[60400]: DEBUG nova.compute.utils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] VimFaultException {{(pid=60400) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1371.260355] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Build of instance f114d70b-3524-4f1c-b1af-71ae3235d040 was re-scheduled: A specified parameter was not correct: fileType
[ 1371.260355] env[60400]: Faults: ['InvalidArgument'] {{(pid=60400) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1371.260806] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Unplugging VIFs for instance {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1371.260970] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60400) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1371.261146] env[60400]: DEBUG nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Deallocating network for instance {{(pid=60400) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1371.261298] env[60400]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] deallocate_for_instance() {{(pid=60400) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 1371.485090] env[60400]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Updating instance_info_cache with network_info: [] {{(pid=60400) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1371.497728] env[60400]: INFO nova.compute.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Took 0.24 seconds to deallocate network for instance.
[ 1371.609028] env[60400]: INFO nova.scheduler.client.report [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Deleted allocations for instance f114d70b-3524-4f1c-b1af-71ae3235d040
[ 1371.624053] env[60400]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "f114d70b-3524-4f1c-b1af-71ae3235d040" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 179.244s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1372.933350] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1372.933689] env[60400]: DEBUG nova.compute.manager [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60400) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
[ 1374.932875] env[60400]: DEBUG oslo_service.periodic_task [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60400) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1374.945813] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1374.946033] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1374.946195] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1374.946346] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60400) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1374.947421] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcc5f18b-1497-4fe6-9086-737f7f7acbd5 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1374.956065] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-611fd8c7-2638-4369-9641-8a49c4d21775 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1374.970614] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24dce261-b2bf-424c-9cb2-f597bb792f45 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1374.976961] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40cd6640-770a-4021-9f6b-bdde399d055d {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1375.006830] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181629MB free_disk=118GB free_vcpus=48 pci_devices=None {{(pid=60400) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1375.006988] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1375.007193] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1375.041668] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1375.041832] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60400) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1375.057397] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b99a124-43e8-423f-bccc-ef87c500c453 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1375.064830] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2ac2032-ff84-4573-9f49-456e216d0629 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1375.094074] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-136778b0-0228-4011-87eb-4cab26f8dde8 {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1375.101283] env[60400]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7f84ed1-f1bc-40a6-9cb7-eb2cbd3051df {{(pid=60400) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1375.114306] env[60400]: DEBUG nova.compute.provider_tree [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed in ProviderTree for provider: a29934a0-6a74-4b6e-8edf-44d7a53db1dc {{(pid=60400) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1375.123094] env[60400]: DEBUG nova.scheduler.client.report [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Inventory has not changed for provider a29934a0-6a74-4b6e-8edf-44d7a53db1dc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 118, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60400) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1375.135596] env[60400]: DEBUG nova.compute.resource_tracker [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60400) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1375.135801] env[60400]: DEBUG oslo_concurrency.lockutils [None req-a331482a-092f-47bc-a84f-168a2aafbb87 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s {{(pid=60400) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}