[ 632.686455] env[59528]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 633.324105] env[59577]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 634.892990] env[59577]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=59577) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 634.893373] env[59577]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=59577) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 634.893454] env[59577]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=59577) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 634.893776] env[59577]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 634.894944] env[59577]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 635.024115] env[59577]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=59577) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 635.034439] env[59577]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=59577) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 635.139025] env[59577]: INFO nova.virt.driver [None req-e3e51f13-0be7-42d0-9070-0d8ea83aad6b None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 635.219158] env[59577]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 635.219330] env[59577]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 635.219468] env[59577]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=59577) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 638.450144] env[59577]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-c07ebb80-304f-4935-80c4-4fe94d1bdca4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 638.465490] env[59577]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=59577) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 638.465643] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-068cb76d-e3f3-421e-8fa0-b0d8cd9481d5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 638.498486] env[59577]: INFO oslo_vmware.api [-] Successfully established new session; session ID is b0c73.
[ 638.498616] env[59577]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.279s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 638.499267] env[59577]: INFO nova.virt.vmwareapi.driver [None req-e3e51f13-0be7-42d0-9070-0d8ea83aad6b None None] VMware vCenter version: 7.0.3
[ 638.502716] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f59f8db0-3823-4043-9ef3-330425960938 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 638.519894] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9184f48a-c542-498a-b9ab-d7251be4b4d1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 638.525783] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e3e4a7f-ca6c-494c-b942-316b69878be0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 638.532283] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afd3aa07-01fc-4a4c-9e16-ef4b5de3fe60 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 638.546015] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc450c82-78df-4530-9426-7bce06a9eb73 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 638.551819] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-796f25af-1a7e-45c4-a39d-f8f76de0052b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 638.580967] env[59577]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-8c119802-17c6-47ba-be49-f56611c0499a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 638.586186] env[59577]: DEBUG nova.virt.vmwareapi.driver [None req-e3e51f13-0be7-42d0-9070-0d8ea83aad6b None None] Extension org.openstack.compute already exists. {{(pid=59577) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 638.588858] env[59577]: INFO nova.compute.provider_config [None req-e3e51f13-0be7-42d0-9070-0d8ea83aad6b None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
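This capture was re-wrapped at some point, so several oslo.log records can end up fused onto one physical line. A small stdlib-only sketch (the regex and the function name `split_records` are illustrative, not part of nova or oslo.log) that re-splits such a capture on the `[ <uptime>] env[<pid>]:` record marker:

```python
import re

# Every record in this capture starts with "[ <uptime>] env[<pid>]:".
# A zero-width lookahead keeps the marker attached to the record it opens.
RECORD_RE = re.compile(r"(?=\[ *\d+\.\d+\] env\[\d+\]:)")

def split_records(blob: str) -> list[str]:
    """Split a flattened log capture back into one record per element."""
    parts = [p.strip() for p in RECORD_RE.split(blob)]
    return [p for p in parts if p]

blob = ("[ 632.686455] env[59528]: first record "
        "[ 633.324105] env[59577]: second record")
print(split_records(blob))
# prints a two-element list, one record per element
```

Splitting on a lookahead rather than a plain match avoids discarding the timestamp prefix; timestamps embedded in messages (e.g. `waited 0.000s`) are not followed by `env[...]:` and therefore never trigger a split.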
[ 638.605625] env[59577]: DEBUG nova.context [None req-e3e51f13-0be7-42d0-9070-0d8ea83aad6b None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),53993b2e-1679-4e82-8d02-931b67b1836d(cell1) {{(pid=59577) load_cells /opt/stack/nova/nova/context.py:464}}
[ 638.607536] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 638.607781] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 638.608515] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 638.608853] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Acquiring lock "53993b2e-1679-4e82-8d02-931b67b1836d" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 638.609052] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Lock "53993b2e-1679-4e82-8d02-931b67b1836d" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 638.610058] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Lock "53993b2e-1679-4e82-8d02-931b67b1836d" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 638.622139] env[59577]: DEBUG oslo_db.sqlalchemy.engines [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59577) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 638.627284] env[59577]: ERROR nova.db.main.api [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 638.627284] env[59577]:     result = function(*args, **kwargs)
[ 638.627284] env[59577]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 638.627284] env[59577]:     return func(*args, **kwargs)
[ 638.627284] env[59577]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 638.627284] env[59577]:     result = fn(*args, **kwargs)
[ 638.627284] env[59577]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 638.627284] env[59577]:     return f(*args, **kwargs)
[ 638.627284] env[59577]:   File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 638.627284] env[59577]:     return db.service_get_minimum_version(context, binaries)
[ 638.627284] env[59577]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 638.627284] env[59577]:     _check_db_access()
[ 638.627284] env[59577]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 638.627284] env[59577]:     stacktrace = ''.join(traceback.format_stack())
[ 638.627284] env[59577]: 
[ 638.628883] env[59577]: DEBUG oslo_db.sqlalchemy.engines [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59577) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 638.631557] env[59577]: ERROR nova.db.main.api [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 638.631557] env[59577]:     result = function(*args, **kwargs)
[ 638.631557] env[59577]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 638.631557] env[59577]:     return func(*args, **kwargs)
[ 638.631557] env[59577]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 638.631557] env[59577]:     result = fn(*args, **kwargs)
[ 638.631557] env[59577]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 638.631557] env[59577]:     return f(*args, **kwargs)
[ 638.631557] env[59577]:   File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 638.631557] env[59577]:     return db.service_get_minimum_version(context, binaries)
[ 638.631557] env[59577]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 638.631557] env[59577]:     _check_db_access()
[ 638.631557] env[59577]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 638.631557] env[59577]:     stacktrace = ''.join(traceback.format_stack())
[ 638.631557] env[59577]: 
[ 638.631918] env[59577]: WARNING nova.objects.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Failed to get minimum service version for cell 53993b2e-1679-4e82-8d02-931b67b1836d
[ 638.632069] env[59577]: WARNING nova.objects.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 638.632481] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Acquiring lock "singleton_lock" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 638.632637] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Acquired lock "singleton_lock" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 638.632887] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Releasing lock "singleton_lock" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 638.633210] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Full set of CONF: {{(pid=59577) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 638.633355] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ******************************************************************************** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 638.633484] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] Configuration options gathered from: {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 638.633619] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 638.633822] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 638.633966] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ================================================================================ {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 638.634192] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] allow_resize_to_same_host = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.634361] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] arq_binding_timeout = 300 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.634494] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] backdoor_port = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.634620] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] backdoor_socket = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.634784] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] block_device_allocate_retries = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.635010] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] block_device_allocate_retries_interval = 3 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.635206] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cert = self.pem {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.635376] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.635548] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute_monitors = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.635710] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] config_dir = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.635882] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] config_drive_format = iso9660 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.636026] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.636191] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] config_source = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.636359] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] console_host = devstack {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.636525] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] control_exchange = nova {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.636683] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cpu_allocation_ratio = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.636860] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] daemon = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.637055] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] debug = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.637218] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] default_access_ip_network_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.637383] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] default_availability_zone = nova {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.637537] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] default_ephemeral_format = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.637798] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.637966] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] default_schedule_zone = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.638142] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] disk_allocation_ratio = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.638302] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] enable_new_services = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.638478] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] enabled_apis = ['osapi_compute'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.638667] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] enabled_ssl_apis = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.638848] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] flat_injected = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.639055] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] force_config_drive = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.639281] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] force_raw_images = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.639463] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] graceful_shutdown_timeout = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.639628] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] heal_instance_info_cache_interval = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.639844] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] host = cpu-1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.640018] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.640183] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] initial_disk_allocation_ratio = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.640344] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] initial_ram_allocation_ratio = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.640553] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.640718] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] instance_build_timeout = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.640874] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] instance_delete_interval = 300 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.641051] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] instance_format = [instance: %(uuid)s] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.641222] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] instance_name_template = instance-%08x {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.641382] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] instance_usage_audit = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.641549] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] instance_usage_audit_period = month {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.641712] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.641879] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] instances_path = /opt/stack/data/nova/instances {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.642073] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] internal_service_availability_zone = internal {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.642245] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] key = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.642407] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] live_migration_retry_count = 30 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.642567] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] log_config_append = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.642734] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.642889] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] log_dir = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.643055] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] log_file = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.643184] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] log_options = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.643343] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] log_rotate_interval = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.643507] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] log_rotate_interval_type = days {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.643670] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] log_rotation_type = none {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.643802] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.643926] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.644102] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.644268] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.644393] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.644551] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] long_rpc_timeout = 1800 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.644709] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] max_concurrent_builds = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.644864] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] max_concurrent_live_migrations = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.645032] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] max_concurrent_snapshots = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.645219] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] max_local_block_devices = 3 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.645387] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] max_logfile_count = 30 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.645540] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] max_logfile_size_mb = 200 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.645694] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] maximum_instance_delete_attempts = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.645860] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] metadata_listen = 0.0.0.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.646035] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] metadata_listen_port = 8775 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.646205] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] metadata_workers = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.646362] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] migrate_max_retries = -1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.646528] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] mkisofs_cmd = genisoimage {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.646731] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] my_block_storage_ip = 10.180.1.21 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.646863] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] my_ip = 10.180.1.21 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.647031] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] network_allocate_retries = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 638.647210] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] non_inheritable_image_properties
= ['cache_in_nova', 'bittorrent'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.647374] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] osapi_compute_listen = 0.0.0.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.647535] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] osapi_compute_listen_port = 8774 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.647729] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] osapi_compute_unique_server_name_scope = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.647903] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] osapi_compute_workers = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.648089] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] password_length = 12 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.648267] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] periodic_enable = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.648430] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] periodic_fuzzy_delay = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.648623] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] pointer_model = usbtablet {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.648808] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] preallocate_images = none {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.648968] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] publish_errors = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.649109] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] pybasedir = /opt/stack/nova {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.649266] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ram_allocation_ratio = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.649423] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] rate_limit_burst = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.649588] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] rate_limit_except_level = CRITICAL {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.649747] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] rate_limit_interval = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.649906] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] reboot_timeout = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.650070] env[59577]: DEBUG oslo_service.service [None 
req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] reclaim_instance_interval = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.650226] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] record = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.650389] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] reimage_timeout_per_gb = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.650551] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] report_interval = 120 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.650710] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] rescue_timeout = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.650911] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] reserved_host_cpus = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.651116] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] reserved_host_disk_mb = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.651287] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] reserved_host_memory_mb = 512 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.651446] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] reserved_huge_pages = None {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.651605] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] resize_confirm_window = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.651765] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] resize_fs_using_block_device = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.651922] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] resume_guests_state_on_host_boot = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.652110] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.652274] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] rpc_response_timeout = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.652436] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] run_external_periodic_tasks = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.652604] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] running_deleted_instance_action = reap {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.652768] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] running_deleted_instance_poll_interval = 1800 {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.652922] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] running_deleted_instance_timeout = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.653091] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler_instance_sync_interval = 120 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.653227] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_down_time = 300 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.653397] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] servicegroup_driver = db {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.653557] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] shelved_offload_time = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.653715] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] shelved_poll_interval = 3600 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.653900] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] shutdown_timeout = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.654088] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] source_is_ipv6 = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.654258] env[59577]: DEBUG oslo_service.service [None 
req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ssl_only = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.654505] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.654673] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] sync_power_state_interval = 600 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.654835] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] sync_power_state_pool_size = 1000 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.655009] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] syslog_log_facility = LOG_USER {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.655174] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] tempdir = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.655333] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] timeout_nbd = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.655500] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] transport_url = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.655661] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] update_resources_interval = 0 {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.655820] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] use_cow_images = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.655976] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] use_eventlog = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.656144] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] use_journal = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.656302] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] use_json = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.656457] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] use_rootwrap_daemon = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.656613] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] use_stderr = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.656768] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] use_syslog = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.656954] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vcpu_pin_set = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.657181] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] 
vif_plugging_is_fatal = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.657359] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plugging_timeout = 300 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.657526] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] virt_mkfs = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.657713] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] volume_usage_poll_interval = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.657881] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] watch_log_file = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.658061] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] web = /usr/share/spice-html5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 638.658260] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_concurrency.disable_process_locking = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.658559] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.658761] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.658934] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.659122] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.659297] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.659465] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.659647] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.auth_strategy = keystone {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.659822] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.compute_link_prefix = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.660011] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.660199] env[59577]: DEBUG oslo_service.service [None 
req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.dhcp_domain = novalocal {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.660368] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.enable_instance_password = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.660530] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.glance_link_prefix = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.660696] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.660867] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.661040] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.instance_list_per_project_cells = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.661207] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.list_records_by_skipping_down_cells = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.661367] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.local_metadata_per_cell = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.661533] env[59577]: DEBUG oslo_service.service [None 
req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.max_limit = 1000 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.661729] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.metadata_cache_expiration = 15 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.661915] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.neutron_default_tenant_id = default {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.662100] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.use_forwarded_for = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.662279] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.use_neutron_default_nets = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.662442] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.662604] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.662772] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.662943] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] 
api.vendordata_dynamic_ssl_certfile = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.663124] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.vendordata_dynamic_targets = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.663290] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.vendordata_jsonfile_path = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.663473] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.663674] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.backend = dogpile.cache.memcached {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.663842] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.backend_argument = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.664021] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.config_prefix = cache.oslo {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.664194] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.dead_timeout = 60.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.664359] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.debug_cache_backend = False {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.664520] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.enable_retry_client = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.664682] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.enable_socket_keepalive = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.664855] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.enabled = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.665030] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.expiration_time = 600 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.665199] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.hashclient_retry_attempts = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.665364] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.hashclient_retry_delay = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.665527] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.memcache_dead_retry = 300 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.665715] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.memcache_password = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.665854] env[59577]: DEBUG 
oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.666043] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.666222] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.memcache_pool_maxsize = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.666384] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.666546] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.memcache_sasl_enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.666727] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.666893] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.memcache_socket_timeout = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.667074] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.memcache_username = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.667251] env[59577]: DEBUG oslo_service.service [None 
req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.proxies = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.667417] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.retry_attempts = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.667592] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.retry_delay = 0.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.667742] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.socket_keepalive_count = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.667904] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.socket_keepalive_idle = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.668074] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.socket_keepalive_interval = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.668233] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.tls_allowed_ciphers = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.668389] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.tls_cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.668551] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.tls_certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.668757] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.tls_enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.668921] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cache.tls_keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.669107] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.auth_section = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.669289] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.auth_type = password {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.669452] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.669629] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.catalog_info = volumev3::publicURL {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.669793] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.669958] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.670137] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.cross_az_attach = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.670300] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.debug = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.670458] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.endpoint_template = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.670623] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.http_retries = 3 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.670785] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.670940] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.671123] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.os_region_name = RegionOne {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.671286] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.671442] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cinder.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.671611] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.671770] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.cpu_dedicated_set = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.671925] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.cpu_shared_set = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.672097] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.image_type_exclude_list = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.672261] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.672424] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.max_concurrent_disk_ops = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.672584] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.max_disk_devices_to_attach = -1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.672747] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.672915] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.673088] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.resource_provider_association_refresh = 300 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.673251] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.shutdown_retry_interval = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.673428] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.673605] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] conductor.workers = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.673783] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] console.allowed_origins = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.673941] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] console.ssl_ciphers = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.674421] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] console.ssl_minimum_version = default {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.674421] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] consoleauth.token_ttl = 600 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.674487] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.674604] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.674766] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.674923] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.connect_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.675089] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.connect_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.675246] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.endpoint_override = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.675408] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.675562] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.675720] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.max_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.675878] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.min_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.676040] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.region_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.676198] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.service_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.676364] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.service_type = accelerator {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.676524] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.676682] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.status_code_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.676838] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.status_code_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.676995] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.677187] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.677346] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] cyborg.version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.677527] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.backend = sqlalchemy {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.677733] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.connection = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.677908] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.connection_debug = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.678091] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.connection_parameters = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.678260] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.connection_recycle_time = 3600 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.678426] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.connection_trace = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.678621] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.db_inc_retry_interval = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.678812] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.db_max_retries = 20 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.678980] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.db_max_retry_interval = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.679158] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.db_retry_interval = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.679329] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.max_overflow = 50 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.679493] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.max_pool_size = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.679663] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.max_retries = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.679827] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.mysql_enable_ndb = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.679997] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.680196] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.mysql_wsrep_sync_wait = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.680325] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.pool_timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.680493] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.retry_interval = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.680652] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.slave_connection = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.680818] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.sqlite_synchronous = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.680981] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] database.use_db_reconnect = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.681177] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.backend = sqlalchemy {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.681355] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.connection = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.681525] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.connection_debug = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.681695] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.connection_parameters = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.681860] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.connection_recycle_time = 3600 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.682036] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.connection_trace = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.682205] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.db_inc_retry_interval = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.682368] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.db_max_retries = 20 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.682528] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.db_max_retry_interval = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.682742] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.db_retry_interval = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.682939] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.max_overflow = 50 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.683119] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.max_pool_size = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.683291] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.max_retries = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.683455] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.mysql_enable_ndb = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.683625] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.683788] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.683951] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.pool_timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.685637] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.retry_interval = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.685826] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.slave_connection = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.686012] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] api_database.sqlite_synchronous = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.686199] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] devices.enabled_mdev_types = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.686382] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.686550] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ephemeral_storage_encryption.enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.686713] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.686888] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.api_servers = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.687064] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.687232] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.687396] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.687560] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.connect_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.687741] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.connect_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.687912] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.debug = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.688092] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.default_trusted_certificate_ids = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.688261] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.enable_certificate_validation = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.688427] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.enable_rbd_download = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.688610] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.endpoint_override = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.688798] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.688964] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.689138] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.max_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.689297] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.min_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.689458] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.num_retries = 3 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.689629] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.rbd_ceph_conf = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.689796] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.rbd_connect_timeout = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.689970] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.rbd_pool = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.690153] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.rbd_user = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.690316] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.region_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.690477] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.service_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.690649] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.service_type = image {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.690814] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.690971] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.status_code_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.691144] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.status_code_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.691302] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.691484] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.691650] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.verify_glance_signatures = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.691811] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] glance.version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.691980] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] guestfs.debug = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.692167] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.config_drive_cdrom = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.692333] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.config_drive_inject_password = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.692499] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.692664] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.enable_instance_metrics_collection = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.692829] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.enable_remotefx = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.692998] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.instances_path_share = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.693179] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.iscsi_initiator_list = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.693339] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.limit_cpu_features = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.693504] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.693665] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.693836] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.power_state_check_timeframe = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.693998] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.694180] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.694344] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.use_multipath_io = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.694507] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.volume_attach_retry_count = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.694668] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.694828] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.vswitch_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.694989] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.695172] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] mks.enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.695523] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.695714] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] image_cache.manager_interval = 2400 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.695885] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] image_cache.precache_concurrency = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.696070] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] image_cache.remove_unused_base_images = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.696242] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.696410] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.696586] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] image_cache.subdirectory_name = _base {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.696765] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.api_max_retries = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.696929] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.api_retry_interval = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.697099] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.auth_section = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.697265] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.auth_type = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.697426] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.697603] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.697826] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.698016] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.connect_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.698181] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.connect_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.698339] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.endpoint_override = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.698503] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.insecure = False
{{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.698673] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.698835] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.max_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.698992] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.min_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.699165] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.partition_key = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.699328] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.peer_list = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.699486] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.region_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.699648] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.serial_console_state_timeout = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.699808] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.service_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.699977] env[59577]: DEBUG 
oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.service_type = baremetal {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.700176] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.700309] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.status_code_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.700464] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.status_code_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.700620] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.700807] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.700968] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] ironic.version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.701164] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.701338] env[59577]: DEBUG oslo_service.service [None 
req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] key_manager.fixed_key = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.701558] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.701811] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.barbican_api_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.701857] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.barbican_endpoint = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.702033] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.barbican_endpoint_type = public {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.702201] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.barbican_region_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.702360] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.702518] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.702706] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] 
barbican.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.702888] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.703073] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.703878] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.number_of_retries = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.703878] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.retry_delay = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.703878] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.send_service_user_token = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.703878] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.703878] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.704080] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.verify_ssl = True {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.704205] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican.verify_ssl_path = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.704400] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican_service_user.auth_section = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.704594] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican_service_user.auth_type = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.704724] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican_service_user.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.704881] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican_service_user.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.705056] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican_service_user.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.705220] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican_service_user.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.705377] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican_service_user.keyfile = None {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.705541] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican_service_user.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.705735] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] barbican_service_user.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.705911] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.approle_role_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.706118] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.approle_secret_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.706241] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.706396] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.706559] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.706721] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.706878] env[59577]: DEBUG 
oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.707062] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.kv_mountpoint = secret {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.707230] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.kv_version = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.707389] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.namespace = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.707545] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.root_token_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.707731] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.707892] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.ssl_ca_crt_file = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.708059] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.708223] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.use_ssl = False {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.708389] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.708554] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.708729] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.708904] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.709074] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.connect_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.709236] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.connect_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.709395] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.endpoint_override = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.709553] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.709714] env[59577]: DEBUG 
oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.709870] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.max_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.710033] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.min_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.710194] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.region_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.710350] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.service_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.710525] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.service_type = identity {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.710691] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.710870] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.status_code_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.711045] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.status_code_retry_delay = 
None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.711208] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.711392] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.711554] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] keystone.version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.711757] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.connection_uri = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.711923] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.cpu_mode = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.712098] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.cpu_model_extra_flags = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.712269] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.cpu_models = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.712440] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.cpu_power_governor_high = performance {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.712608] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.cpu_power_governor_low = powersave {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.712774] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.cpu_power_management = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.712949] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.713126] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.device_detach_attempts = 8 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.713292] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.device_detach_timeout = 20 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.713460] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.disk_cachemodes = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.713622] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.disk_prefix = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.713787] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.enabled_perf_events = [] {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.713954] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.file_backed_memory = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.714124] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.gid_maps = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.714285] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.hw_disk_discard = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.714445] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.hw_machine_type = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.714616] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.images_rbd_ceph_conf = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.714784] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.714951] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.715129] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.images_rbd_glance_store_name = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
638.715298] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.images_rbd_pool = rbd {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.715463] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.images_type = default {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.715639] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.images_volume_group = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.715814] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.inject_key = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.715977] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.inject_partition = -2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.716151] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.inject_password = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.716313] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.iscsi_iface = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.716471] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.iser_use_multipath = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.716633] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None 
None] libvirt.live_migration_bandwidth = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.716795] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.716956] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_downtime = 500 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.717128] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.717292] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.717452] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_inbound_addr = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.717646] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.717834] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_permit_post_copy = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.718009] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None 
None] libvirt.live_migration_scheme = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.718195] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_timeout_action = abort {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.718364] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_tunnelled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.718523] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_uri = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.718707] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.live_migration_with_native_tls = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.718882] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.max_queues = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.719058] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.719223] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.nfs_mount_options = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.719529] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.nfs_mount_point_base = 
/opt/stack/data/n-cpu-1/mnt {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.719702] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.719872] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.num_iser_scan_tries = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.720042] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.num_memory_encrypted_guests = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.720211] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.720372] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.num_pcie_ports = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.720541] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.num_volume_scan_tries = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.720706] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.pmem_namespaces = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.720865] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.quobyte_client_cfg = None {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.721167] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.721344] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.rbd_connect_timeout = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.721513] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.721681] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.721844] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.rbd_secret_uuid = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.721999] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.rbd_user = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.722176] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.722348] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.remote_filesystem_transport = ssh {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.722509] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.rescue_image_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.722665] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.rescue_kernel_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.722845] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.rescue_ramdisk_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.723028] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.723194] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.rx_queue_size = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.723363] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.smbfs_mount_options = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.723645] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.723820] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.snapshot_compression = False {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.723983] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.snapshot_image_format = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.724216] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.724386] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.sparse_logical_volumes = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.724551] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.swtpm_enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.724724] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.swtpm_group = tss {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.724889] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.swtpm_user = tss {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.725073] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.sysinfo_serial = unique {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.725236] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.tx_queue_size = None {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.725405] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.uid_maps = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.725581] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.use_virtio_for_bridges = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.725773] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.virt_type = kvm {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.725947] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.volume_clear = zero {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.726125] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.volume_clear_size = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.726294] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.volume_use_multipath = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.726451] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.vzstorage_cache_path = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.726622] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.726793] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.vzstorage_mount_group = qemu {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.726958] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.vzstorage_mount_opts = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.727140] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.727419] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.727618] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.vzstorage_mount_user = stack {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.727802] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.727979] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.auth_section = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.728171] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.auth_type = password {{(pid=59577) log_opt_values 
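The `libvirt.*` values logged above are the effective contents of the `[libvirt]` section of `nova.conf` (plus defaults). A sketch of what that section would look like, reconstructed only from the values actually logged here, not from the real file:

```ini
[libvirt]
virt_type = kvm
live_migration_completion_timeout = 800
live_migration_downtime = 500
live_migration_timeout_action = abort
snapshots_directory = /opt/stack/data/nova/instances/snapshots
volume_clear = zero
wait_soft_reboot_seconds = 120
```

Options logged as their defaults (e.g. `live_migration_permit_post_copy = False`) would normally be absent from the file; `log_opt_values` prints every registered option regardless of whether it was explicitly set.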
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.728336] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.728496] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.728675] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.728849] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.connect_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.729025] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.connect_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.729197] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.default_floating_pool = public {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.729360] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.endpoint_override = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.729525] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.extension_sync_interval = 600 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.729711] env[59577]: DEBUG 
oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.http_retries = 3 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.729896] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.730069] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.730234] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.max_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.730406] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.730567] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.min_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.730734] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.ovs_bridge = br-int {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.730901] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.physnets = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.731083] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.region_name = RegionOne {{(pid=59577) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.731257] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.service_metadata_proxy = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.731419] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.service_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.731590] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.service_type = network {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.731762] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.731923] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.status_code_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.732092] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.status_code_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.732258] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.732443] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.732608] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] neutron.version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.732786] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] notifications.bdms_in_notifications = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.732963] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] notifications.default_level = INFO {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.733156] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] notifications.notification_format = unversioned {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.733321] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] notifications.notify_on_state_change = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.733498] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.733676] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] pci.alias = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.733852] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] pci.device_spec = [] {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.734028] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] pci.report_in_placement = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.734207] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.auth_section = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.734383] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.auth_type = password {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.734554] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.734720] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.734910] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.735091] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.735254] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.connect_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.735414] env[59577]: 
DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.connect_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.735589] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.default_domain_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.735764] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.default_domain_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.735925] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.domain_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.736096] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.domain_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.736260] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.endpoint_override = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.736491] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.736567] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.736724] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] 
placement.max_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.736881] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.min_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.737061] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.password = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.737227] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.project_domain_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.737394] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.project_domain_name = Default {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.737584] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.project_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.737812] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.project_name = service {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.738009] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.region_name = RegionOne {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.738182] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.service_name = None {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.738355] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.service_type = placement {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.738521] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.738681] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.status_code_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.738844] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.status_code_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.739011] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.system_scope = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.739177] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.739337] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.trust_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.739494] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.user_domain_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.739665] env[59577]: 
DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.user_domain_name = Default {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.739832] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.user_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.740019] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.username = placement {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.740209] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.740372] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] placement.version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.740549] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.cores = 20 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.740713] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.count_usage_from_placement = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.740887] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.741073] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None 
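Similarly, the `placement.*` entries above describe the Keystone auth configuration the compute service uses to reach the Placement API. A hedged reconstruction as a `nova.conf` fragment (the password is masked as `****` in the log, so it is left as a placeholder here):

```ini
[placement]
auth_type = password
auth_url = http://10.180.1.21/identity
project_domain_name = Default
project_name = service
user_domain_name = Default
username = placement
password = <masked in log output>
region_name = RegionOne
valid_interfaces = internal,public
```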
None] quota.injected_file_content_bytes = 10240 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.741244] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.injected_file_path_length = 255 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.741411] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.injected_files = 5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.741575] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.instances = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.741764] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.key_pairs = 100 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.741944] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.metadata_items = 128 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.742126] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.ram = 51200 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.742295] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.recheck_quota = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.742464] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.server_group_members = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.742631] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] quota.server_groups = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.742804] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] rdp.enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.743127] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.743319] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.743490] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.743655] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.image_metadata_prefilter = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.743825] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.743987] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.max_attempts = 3 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.744163] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.max_placement_results = 1000 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.744328] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.744492] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.query_placement_for_availability_zone = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.744652] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.query_placement_for_image_type_support = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.744812] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.744984] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] scheduler.workers = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.745175] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.745348] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.745530] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.745722] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.745899] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.746079] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.746247] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.746440] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.746612] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.host_subset_size = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.746779] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.746971] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.747154] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.isolated_hosts = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.747319] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.isolated_images = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.747508] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.747702] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.747871] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.pci_in_placement = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.748044] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.748209] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.748370] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.748530] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.748690] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.748856] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.749019] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.track_instance_changes = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.749201] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.749370] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] metrics.required = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.749533] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] metrics.weight_multiplier = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.749695] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.749861] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] metrics.weight_setting = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.750182] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.750359] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] serial_console.enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.750538] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] serial_console.port_range = 10000:20000 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.750709] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.750880] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.751059] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] serial_console.serialproxy_port = 6083 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.751229] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_user.auth_section = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.751403] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_user.auth_type = password {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.751565] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_user.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.751725] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_user.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.751888] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_user.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.752061] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_user.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.752224] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_user.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.752395] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_user.send_service_user_token = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.752560] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_user.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.752719] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] service_user.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.752888] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.agent_enabled = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.753072] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.753363] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.753557] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.753748] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.html5proxy_port = 6082 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.753930] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.image_compression = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.754104] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.jpeg_compression = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.754267] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.playback_compression = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.754437] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.server_listen = 127.0.0.1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.754605] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.754764] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.streaming_mode = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.754921] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] spice.zlib_compression = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.755097] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] upgrade_levels.baseapi = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.755260] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] upgrade_levels.cert = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.755430] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] upgrade_levels.compute = auto {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.755593] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] upgrade_levels.conductor = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.755759] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] upgrade_levels.scheduler = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.755928] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vendordata_dynamic_auth.auth_section = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.756102] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vendordata_dynamic_auth.auth_type = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.756264] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vendordata_dynamic_auth.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.756420] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vendordata_dynamic_auth.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.756582] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.756744] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vendordata_dynamic_auth.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.756901] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vendordata_dynamic_auth.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.757071] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.757231] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vendordata_dynamic_auth.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.757405] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.api_retry_count = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.757584] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.ca_file = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.757772] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.cache_prefix = devstack-image-cache {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.757943] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.cluster_name = testcl1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.758123] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.connection_pool_size = 10 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.758287] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.console_delay_seconds = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.758457] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.datastore_regex = ^datastore.* {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.758675] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.758873] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.host_password = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.759054] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.host_port = 443 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.759232] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.host_username = administrator@vsphere.local {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.759404] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.insecure = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.759566] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.integration_bridge = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.759732] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.maximum_objects = 100 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.759903] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.pbm_default_policy = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.760098] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.pbm_enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.760263] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.pbm_wsdl_location = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.760434] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.760593] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.serial_port_proxy_uri = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.760751] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.serial_port_service_uri = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.760914] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.task_poll_interval = 0.5 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.761097] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.use_linked_clone = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.761266] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.vnc_keymap = en-us {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.761436] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.vnc_port = 5900 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.761601] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vmware.vnc_port_total = 10000 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.761786] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vnc.auth_schemes = ['none'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.761966] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vnc.enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.762264] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.762448] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.762618] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vnc.novncproxy_port = 6080 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.762796] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vnc.server_listen = 127.0.0.1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.762968] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.763143] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vnc.vencrypt_ca_certs = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.763305] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vnc.vencrypt_client_cert = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.763461] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vnc.vencrypt_client_key = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.763640] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.763807] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.disable_deep_image_inspection = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.763967] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.764141] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.764301] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.764462] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.disable_rootwrap = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.764623] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.enable_numa_live_migration = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.764786] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.764951] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.765120] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.765283] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.libvirt_disable_apic = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.765444] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.765629] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.765809] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.765976] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.766153] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.766314] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.766474] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.766635] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.766795] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.766961] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.767161] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.767334] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.client_socket_timeout = 900 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.767499] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.default_pool_size = 1000 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.767714] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.keep_alive = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.767887] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.max_header_line = 16384 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.768062] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.secure_proxy_ssl_header = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.768225] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.ssl_ca_file = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.768384] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.ssl_cert_file = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.768544] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.ssl_key_file = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.768750] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.tcp_keepidle = 600 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.768972] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.769156] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] zvm.ca_file = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.769321] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] zvm.cloud_connector_url = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.769603] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.769823] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] zvm.reachable_timeout = 300 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.770061] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_policy.enforce_new_defaults = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.770249] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_policy.enforce_scope = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.770429] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_policy.policy_default_rule = default {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.770617] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.770797] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_policy.policy_file = policy.yaml {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 638.770971] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None
None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.771167] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.771305] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.771462] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.771624] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.771797] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.771973] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.772167] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] profiler.connection_string = messaging:// {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.772337] env[59577]: DEBUG 
oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] profiler.enabled = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.772505] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] profiler.es_doc_type = notification {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.772669] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] profiler.es_scroll_size = 10000 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.772838] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] profiler.es_scroll_time = 2m {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.773006] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] profiler.filter_error_trace = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.773181] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] profiler.hmac_keys = SECRET_KEY {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.773349] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] profiler.sentinel_service_name = mymaster {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.773518] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] profiler.socket_timeout = 0.1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.773681] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] 
profiler.trace_sqlalchemy = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.773848] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] remote_debug.host = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.774014] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] remote_debug.port = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.774197] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.774361] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.774524] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.774687] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.774850] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.775022] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] 
oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.775182] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.775344] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.775505] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.775716] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.775910] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.776093] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.776262] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.776427] env[59577]: DEBUG 
oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.776588] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.776765] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.776926] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.777095] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.777262] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.777508] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.777716] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.777890] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.778064] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.778230] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.778398] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.778565] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.ssl = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.778742] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.778910] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.779080] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=59577) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.779254] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.779421] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_rabbit.ssl_version = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.779610] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.779776] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_notifications.retry = -1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.779957] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.780143] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_messaging_notifications.transport_url = **** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.780315] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.auth_section = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.780476] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.auth_type = None {{(pid=59577) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.780633] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.cafile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.780808] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.certfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.780980] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.collect_timing = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.781154] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.connect_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.781313] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.connect_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.781468] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.endpoint_id = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.781628] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.endpoint_override = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.781790] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.insecure = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.781948] env[59577]: DEBUG 
oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.keyfile = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.782125] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.max_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.782284] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.min_version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.782440] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.region_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.782597] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.service_name = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.782756] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.service_type = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.782916] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.split_loggers = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.783084] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.status_code_retries = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.783244] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] 
oslo_limit.status_code_retry_delay = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.783397] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.timeout = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.783552] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.valid_interfaces = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.783706] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_limit.version = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.783865] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_reports.file_event_handler = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.784037] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.784197] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] oslo_reports.log_dir = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.784365] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.784522] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_linux_bridge_privileged.group = None 
{{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.784678] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.784842] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.785007] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.785170] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.785337] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.785492] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_ovs_privileged.group = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.785648] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.785809] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] 
vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.785968] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.786135] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] vif_plug_ovs_privileged.user = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.786301] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_linux_bridge.flat_interface = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.786479] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.786650] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.786822] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.786991] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.787176] env[59577]: DEBUG oslo_service.service [None 
req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.787342] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.787501] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.787715] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_ovs.isolate_vif = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.787887] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.788066] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.788240] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.788410] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_ovs.ovsdb_interface = native {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.788575] env[59577]: DEBUG oslo_service.service [None 
req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_vif_ovs.per_port_bridge = False {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.788762] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_brick.lock_path = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.788939] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.789114] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.789285] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] privsep_osbrick.capabilities = [21] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.789444] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] privsep_osbrick.group = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.789600] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] privsep_osbrick.helper_command = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.789766] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.789955] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] 
privsep_osbrick.thread_pool_size = 8 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.790136] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] privsep_osbrick.user = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.790313] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.790470] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] nova_sys_admin.group = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.790624] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] nova_sys_admin.helper_command = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.790790] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.790948] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.791137] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] nova_sys_admin.user = None {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 638.791270] env[59577]: DEBUG oslo_service.service [None req-9ef5f495-24ad-4011-8a72-c6cabf678fd1 None None] 
******************************************************************************** {{(pid=59577) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 638.791678] env[59577]: INFO nova.service [-] Starting compute node (version 0.1.0) [ 638.800089] env[59577]: INFO nova.virt.node [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Generated node identity cbad7164-1dca-4b60-b95b-712603801988 [ 638.800322] env[59577]: INFO nova.virt.node [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Wrote node identity cbad7164-1dca-4b60-b95b-712603801988 to /opt/stack/data/n-cpu-1/compute_id [ 638.810868] env[59577]: WARNING nova.compute.manager [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Compute nodes ['cbad7164-1dca-4b60-b95b-712603801988'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 638.841409] env[59577]: INFO nova.compute.manager [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 638.862719] env[59577]: WARNING nova.compute.manager [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
[ 638.862948] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.863173] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.863321] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 638.863478] env[59577]: DEBUG nova.compute.resource_tracker [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 638.864592] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e3135aa-0013-4594-a80f-1680d430957a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.873269] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69ba4e61-c606-4461-a1db-4057683e3592 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.886636] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8abab375-3cff-4005-824d-3a39d024d867 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.892623] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-debb37c0-9ee6-416e-96df-9983b508039e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.920589] env[59577]: DEBUG nova.compute.resource_tracker [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181315MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 638.920732] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.920912] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.932192] env[59577]: WARNING nova.compute.resource_tracker [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] No compute node record for cpu-1:cbad7164-1dca-4b60-b95b-712603801988: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cbad7164-1dca-4b60-b95b-712603801988 could not be found. 
[ 638.943935] env[59577]: INFO nova.compute.resource_tracker [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: cbad7164-1dca-4b60-b95b-712603801988 [ 638.993054] env[59577]: DEBUG nova.compute.resource_tracker [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 638.993266] env[59577]: DEBUG nova.compute.resource_tracker [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 639.097316] env[59577]: INFO nova.scheduler.client.report [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] [req-1b48c116-d702-4ab6-ac3f-c7f515861eeb] Created resource provider record via placement API for resource provider with UUID cbad7164-1dca-4b60-b95b-712603801988 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
[ 639.114443] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5ff266a-c2c7-4889-9c16-1a0563ca4174 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.121812] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6ad3b2a-2d13-431e-974d-7db3087ced21 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.150562] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca11d5a5-b428-4b08-8790-b76a78de272e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.157133] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-402e54a9-4a62-4d4d-8b51-0533bb7e7a11 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.169871] env[59577]: DEBUG nova.compute.provider_tree [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Updating inventory in ProviderTree for provider cbad7164-1dca-4b60-b95b-712603801988 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 639.204170] env[59577]: DEBUG nova.scheduler.client.report [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Updated inventory for provider cbad7164-1dca-4b60-b95b-712603801988 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 
'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 639.204404] env[59577]: DEBUG nova.compute.provider_tree [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Updating resource provider cbad7164-1dca-4b60-b95b-712603801988 generation from 0 to 1 during operation: update_inventory {{(pid=59577) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 639.204549] env[59577]: DEBUG nova.compute.provider_tree [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Updating inventory in ProviderTree for provider cbad7164-1dca-4b60-b95b-712603801988 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 639.243622] env[59577]: DEBUG nova.compute.provider_tree [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Updating resource provider cbad7164-1dca-4b60-b95b-712603801988 generation from 1 to 2 during operation: update_traits {{(pid=59577) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 639.260395] env[59577]: DEBUG nova.compute.resource_tracker [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 639.260579] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.340s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 639.260743] env[59577]: DEBUG nova.service [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Creating RPC server for service compute {{(pid=59577) start /opt/stack/nova/nova/service.py:182}} [ 639.272686] env[59577]: DEBUG nova.service [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] Join ServiceGroup membership for this service compute {{(pid=59577) start /opt/stack/nova/nova/service.py:199}} [ 639.272870] env[59577]: DEBUG nova.servicegroup.drivers.db [None req-9db88eca-5ff4-48b5-8a92-a30bf92f4f05 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=59577) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 677.276834] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 677.287754] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Getting list of instances from cluster (obj){ [ 677.287754] env[59577]: value = "domain-c8" [ 677.287754] env[59577]: _type = "ClusterComputeResource" [ 677.287754] env[59577]: } {{(pid=59577) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 677.288920] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f935009-7f16-46e1-9326-3034046d23f5 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.298767] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Got total of 0 instances {{(pid=59577) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 677.298983] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 677.299297] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Getting list of instances from cluster (obj){ [ 677.299297] env[59577]: value = "domain-c8" [ 677.299297] env[59577]: _type = "ClusterComputeResource" [ 677.299297] env[59577]: } {{(pid=59577) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 677.300126] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f874fde-35f2-4d90-b13f-9343eeff1be2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.307670] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Got total of 0 instances {{(pid=59577) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 695.053823] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.053823] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.053823] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 695.053823] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 695.064230] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 695.064507] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.064841] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.065319] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.065714] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.065857] env[59577]: DEBUG 
oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.066040] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.066232] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 695.066391] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.077812] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.078053] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.078219] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.078398] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 695.079480] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bc8069e-a5e2-4d92-aa0b-f1416e63ea68 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.088194] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-168abac4-37dd-4ecf-a5f1-1008768b98b3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.102833] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd3e75fc-6a5c-43fb-9b6a-f7969e00f280 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.109764] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60f5ecc5-5e58-40a6-9767-32b0f2054594 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.140053] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181322MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 695.140500] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.140500] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.178883] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 695.179318] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 695.201616] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67babdbc-882a-40cb-8389-db6855452149 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.209239] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d23c1d1-d3be-487b-9ac4-37cac8209c9a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.239569] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36a79b01-0f30-4cef-a35e-b343ca95f6cb {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.246985] env[59577]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4725a59-b3d3-42a2-aea6-a9ba356a480d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.261535] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 695.270126] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 695.271302] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 695.271474] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 755.257609] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task 
ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 755.258040] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 755.268523] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 755.268764] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 755.268879] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 755.277991] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 755.278220] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 755.278377] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 755.278519] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 755.278665] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 755.278793] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 756.044299] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 756.044615] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 756.054277] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.054477] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.054637] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.054790] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 
756.055843] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac567f43-8e66-4431-a2aa-171923161136 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.064636] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66ae8e97-91df-423e-ad18-2f75af35a8b5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.078008] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24f4c624-4970-48d9-897d-bea4808356d5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.084099] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfd430ad-fe21-449a-9b7f-4fd5858e4011 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.113011] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181325MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 756.113213] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.113343] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s 
{{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.143211] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 756.143382] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 756.156563] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6c44413-e466-4c65-bc1c-5741866d94f8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.164245] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85c995af-e5f5-4088-9e2a-ade6f2dea809 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.192982] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-621d5057-60a3-4c27-9cdf-b2eef9f63180 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.199997] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaaa78aa-b279-46f2-a127-9fed2669c921 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.212722] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: 
cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 756.220500] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 756.221624] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 756.221794] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 757.222196] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 815.045308] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 815.045741] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 816.045306] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 816.045506] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 816.045744] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 816.054475] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 816.054667] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 816.054830] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 816.054981] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 816.055128] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 816.055497] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 816.064515] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 816.064735] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 816.064899] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 816.065058] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 816.066089] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-145bbab4-f268-44a5-a773-c530ebb811cf {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.075084] env[59577]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb8c9791-7284-4051-aae4-42d012089308 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.088661] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2079e7a5-c356-4d68-a17e-9bb9f96231d3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.095097] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-888d9a14-47e4-47bd-92b7-47a2e068abf4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.123532] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181325MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 816.123671] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 816.123810] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 816.157797] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) 
_report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 816.157955] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 816.171102] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72ea5a09-81b6-4a2b-88b7-46ce0382bdb0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.178526] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95a0b908-845a-4c98-8593-10504ca1ebc4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.207931] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-150730b5-74d0-4c95-99d4-25f0c333ccaf {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.215401] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1436f577-bff5-49cf-9450-346f75d94123 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.228293] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 816.236506] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider 
cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 816.237610] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 816.237780] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 817.232358] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 818.044465] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 875.044898] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 875.044898] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 876.046112] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 876.046112] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 877.045315] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 877.045475] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 877.045628] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 877.055628] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 877.055956] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 877.056032] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.056147] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 877.057236] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-950d5528-f85e-4a05-9cd0-de1a4b798bb0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.065844] env[59577]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-403a7e00-f7e3-4912-adcd-b852b140a815 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.079254] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a430d95-4296-462b-bd39-364379246bd2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.085554] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac0b0156-a964-4c3b-8cd7-02e8739eb38d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.114155] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181325MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 877.114298] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 877.114478] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 877.142973] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) 
_report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 877.143156] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 877.155876] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7e87c4e-5982-41b4-bf0b-269efe0937c2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.163231] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8001472d-b9d1-42d7-85c0-eec29fdf8399 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.191440] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d86883e-0ec5-4a40-bae8-b8b2ca5001ec {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.198124] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da7a746f-bdd6-4650-8d05-b00005fddefe {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.210682] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 877.218053] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider 
cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 877.219149] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 877.219319] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.214603] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 878.224433] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 878.224609] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 878.224703] env[59577]: DEBUG 
nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 878.233800] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 879.059246] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 880.045166] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 935.044814] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 935.045258] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 935.045258] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Cleaning up deleted instances {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 935.058383] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] There are 0 instances to clean {{(pid=59577) 
_run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 935.058614] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 935.058736] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Cleaning up deleted instances with incomplete migration {{(pid=59577) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 935.067686] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 936.074627] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 936.075065] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 937.044946] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 937.054741] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 937.054948] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 937.055124] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 937.055284] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 937.056334] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7142018b-3310-460e-a273-4fee08f898a1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 937.065100] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3eeba65-b3d8-4ba9-bac0-12b37ed3216e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 937.078193] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1ae908f-d698-4a32-bf3b-da43257702ab {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 937.084425] env[59577]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2564b44c-2664-4e9a-b8c1-b7506b2e3284 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 937.113305] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181333MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 937.113478] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 937.113647] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 937.144022] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 937.144022] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 937.158698] env[59577]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a712fc-afbd-4fcc-aed0-5dc5c539ede2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 937.165955] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2399162-c05d-4fd3-b195-305a89ad4a5c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 937.194151] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0672941c-b0aa-42e1-b242-c631f117da30 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 937.200600] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb7b2f9e-b12d-4a0c-8bc0-63e62a70917a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 937.212832] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 937.220017] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 937.221066] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 937.221289] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 938.221554] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 938.221874] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 938.221912] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 939.045987] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 939.046260] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 939.046307] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 939.054803] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 940.048549] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 942.045380] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 995.045568] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 996.046027] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 996.046457] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 997.044998] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 997.054642] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 997.054959] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 997.055075] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 997.055202] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 997.056272] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-248ea0d5-b6d9-4228-83fb-2cc244109982 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 997.065098] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c645174-f370-4f22-9d3a-213b8caaa3d8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 997.078387] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00bc8ba8-ce8f-4f05-a008-4335fd202052 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 997.084430] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-688dc722-c5d7-4069-a7d0-684fbced8771 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 997.112538] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181334MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 997.112663] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 997.112857] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 997.180269] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 997.180438] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 997.197149] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing inventories for resource provider cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 997.208898] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Updating ProviderTree inventory for provider cbad7164-1dca-4b60-b95b-712603801988 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 997.209082] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Updating inventory in ProviderTree for provider cbad7164-1dca-4b60-b95b-712603801988 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 997.220125] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing aggregate associations for resource provider cbad7164-1dca-4b60-b95b-712603801988, aggregates: None {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 997.235536] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing trait associations for resource provider cbad7164-1dca-4b60-b95b-712603801988, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 997.246908] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac6dee22-20fd-4e2a-b7c2-2140e5e81004 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 997.255279] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce43db9e-23e1-47f0-9741-321c168b1496 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 997.283844] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61be7bd3-c8bb-4f54-94dc-6bba9a89b3e7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 997.290478] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc47a7dd-4753-478b-ba80-c16ea3ebc663 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 997.303696] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 997.311688] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 997.312817] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 997.313013] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 999.312471] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1000.045693] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1000.045877] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1001.040666] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1001.041091] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1001.050406] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1001.050553] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1001.050667] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1001.058954] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1004.045420] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1056.044804] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1056.045194] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1056.045466] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1057.046045] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1057.054825] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1057.055037] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1057.055204] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1057.055355] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1057.056439] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6533dc20-4652-4cbe-8626-270b07e0ee28 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1057.064973] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91692360-02ce-4fff-a968-3afb05963962 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1057.078336] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b734ce05-a7fd-485b-a422-38ed3d8dc52d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1057.084307] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afa7c553-b024-4896-adbb-0a62ab12284a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1057.111868] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181321MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1057.112015] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1057.112194] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1057.140636] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1057.140801] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1057.154161] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a743f06b-49c5-42e0-a7f9-02898ce5b1cd {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1057.161426] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9251d88b-feb3-4634-bb43-bde681f8c631 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1057.191556] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b36a606-a47f-4c0d-b506-33a134dae876 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1057.198571] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac4cb3b7-0b81-4827-9a65-b60394d27eb2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1057.210919] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1057.218806] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1057.219907] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1057.220087] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1059.221096] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1061.040177] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1062.044999] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1062.045383] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1063.045922] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1063.046502] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1063.046502] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1063.054800] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1065.045368] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1116.044894] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1117.045775] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1118.044701] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1119.046044] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1119.055180] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1119.055385] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1119.055591] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1119.055754] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1119.056788] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-973fc36c-ab05-4ffd-8205-230472cb60b2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.065172] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d6553a9-f9e7-47a3-bffa-95b84ca85f1a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.078454] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-340a0e6c-36b7-48d8-9caa-622e8fcbe0ae {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.084543] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e38196c5-209e-4177-be7f-01271ecfaf4d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.112681] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181331MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1119.112816] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1119.112996] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1119.143028] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1119.143028] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1119.156726] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c59b3dc0-bedc-4bac-97ef-962b83aac727 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.164289] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-642ddc72-d439-44ee-8b54-8c6305670455 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.193445] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-206d2142-1c84-4796-9d71-96cdad42f0f4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.200731] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7906ac63-9c38-4711-8464-61d80b658b7d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.214565] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1119.222652] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1119.223776] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1119.223959] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1120.223727] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1122.040468] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1123.040225] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1123.049349] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1123.049663] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1124.045581] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1124.045825] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1124.045902] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1124.054657] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update.
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1125.044073] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1177.046086] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1178.044893] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1179.045621] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1181.046037] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1181.055973] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1181.056205] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1181.056364] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1181.056513] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1181.057544] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-538792d8-8879-449d-a61c-cc71eff3257f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1181.066051] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83336011-c9b8-412f-8849-82584a17a648 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1181.079392] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19b99170-8371-44fd-bd0e-afc196818f17 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1181.085319] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-459892a4-9f29-4475-ba9e-16952ee7f102 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1181.113659] env[59577]: DEBUG nova.compute.resource_tracker [None 
req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181322MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1181.113807] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1181.113973] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1181.144455] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1181.144624] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1181.159170] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d52c886-8507-44e6-b075-b7993eb87d28 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1181.166512] env[59577]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-001a8814-3ed0-4bb6-8564-d97926d61992 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1181.194616] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ca5825c-2de1-428b-b610-0805df12fc7a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1181.201225] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7a677a2-60f5-40cb-bc87-3be0fc564ca3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1181.213447] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1181.220845] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1181.221922] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 
1181.222106] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1182.216471] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1182.216846] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1185.045216] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1185.045647] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1186.045645] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1186.046072] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1186.046072] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1186.054775] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1186.054971] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1237.046758] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1238.045523] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1238.045772] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Cleaning up deleted instances with incomplete migration {{(pid=59577) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 1239.054491] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1240.046072] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1240.046072] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59577) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1240.046072] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Cleaning up deleted instances {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 1240.058027] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] There are 0 instances to clean {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 1242.058572] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1243.040710] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1243.044302] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1243.055071] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1243.055071] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1243.055071] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1243.055337] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1243.056239] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0611a04d-5a28-46d9-b08e-4d57963243f3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.064212] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-358db7b3-17bf-4337-9979-e8f6123f4309 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.077745] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae90bbd5-f9c2-409b-9f98-8bf666afe610 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.083660] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38c8eb0a-53a3-4df3-b43f-ab4a55fed0d0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.112517] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 
free_ram=181316MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1243.112667] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1243.112836] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1243.143524] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1243.143691] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1243.156856] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-191d1232-2f35-440f-8673-4b644050b609 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.163672] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf6a9b9b-c38d-454c-8349-bfc47e543a34 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.192902] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fec0c510-c630-49a4-9b53-cca7c0122946 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.199414] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12bfafa8-11ed-465e-81d4-f3665bab4c42 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.211943] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1243.219881] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1243.220976] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1243.221162] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1245.222079] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1245.222488] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1246.040606] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1248.045119] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1248.045119] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1248.045119] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1248.054876] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1248.054876] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1248.054876] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1277.285710] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1277.294711] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Getting list of instances from cluster (obj){ [ 1277.294711] env[59577]: value = "domain-c8" [ 1277.294711] env[59577]: _type = "ClusterComputeResource" [ 1277.294711] env[59577]: } {{(pid=59577) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1277.295762] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2893e4cc-90ef-43be-984c-5be1ef6f30e3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1277.304730] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Got total of 0 instances {{(pid=59577) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1298.063760] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task 
ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1300.045436] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1302.046647] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1304.045974] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1305.040786] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1305.045383] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1305.045383] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1305.045383] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1305.058026] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1305.058026] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1305.058026] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1305.058026] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1305.058026] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-224f3f84-414f-4075-b539-000d2731803c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1305.066736] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff7b4c1e-5dac-42b4-b112-80a031b629fa {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1305.080592] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9455c137-12a4-4137-8ad8-7c111461145d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1305.088020] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c85536d-9925-4006-b718-29dbd9cf036a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1305.116991] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181326MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1305.117348] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1305.117672] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1305.214358] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1305.214358] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1305.227410] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing inventories for resource provider cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 1305.240928] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Updating ProviderTree inventory for provider cbad7164-1dca-4b60-b95b-712603801988 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 1305.240928] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Updating inventory in ProviderTree for provider cbad7164-1dca-4b60-b95b-712603801988 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 1305.249281] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing aggregate associations for resource provider cbad7164-1dca-4b60-b95b-712603801988, aggregates: None {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 1305.266118] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing trait associations for resource provider cbad7164-1dca-4b60-b95b-712603801988, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 1305.276204] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ab6b08d-3a00-4cb3-bde1-4626ec4836a1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1305.284026] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57c71c89-f496-48c8-aca0-5c7e4faa38c1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1305.312663] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed9b7dc5-6846-4793-bf9d-3029bf4e14e3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1305.319660] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf453704-a7af-4a1e-9475-bd70dbcfc360 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1305.332500] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1305.341309] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1305.342688] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1305.343024] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.225s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1309.343698] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1310.046837] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1310.046837] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1310.046837] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1310.054288] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1359.045248] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1360.045233] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1362.046426] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1365.045061] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1365.055184] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1365.055403] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1365.055567] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1365.055719] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1365.056760] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8227e821-5bc0-459c-a63f-556ddea82297 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1365.065620] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf7818b7-7a54-4959-94f7-d5879e012a18 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1365.079649] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18742fea-12ef-44db-832a-064e64ad61e2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1365.086056] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44ed690d-2254-458c-9e0d-f954ccdcea41 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1365.114728] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181307MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1365.114902] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1365.115078] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1365.149248] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1365.149418] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1365.162146] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c9b670e-ec46-4aa6-891e-69a4c3d6e7c4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1365.169941] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fbb579d-7414-40fc-a7bf-c8c08c37cdef {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1365.199629] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0ff4797-73f8-40cf-8a77-4635069bb0ba {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1365.206390] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-048479f0-43e7-455b-af59-d808d668dff4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1365.218728] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1365.227340] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1365.228459] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1365.228629] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1366.224214] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1366.224614] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1366.224614] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1366.224747] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1369.041429] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1369.051974] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1371.045321] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1371.045693] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1371.045693] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1371.053961] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1420.045386] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1420.046117] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1422.046237] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1425.045582] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1425.055487] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1425.055700] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1425.055861] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1425.056024] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1425.057153] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b95eb86-022b-4880-bd97-0108523ae13e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1425.066490] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14858bf4-2446-4348-a923-cdae63be2bd1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1425.080070] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12fa7423-bb84-44fc-a64c-d166883ed637 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1425.086061] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bb5fc20-d3be-4ea4-a098-f13dee4eba0c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1425.118863] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181326MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1425.118991] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1425.119259] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1425.149855] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1425.150030] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1425.163514] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0928c30-fbf3-43a0-a9f8-76c3c838bf54 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1425.171680] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-845a1852-6cc6-4b61-8d48-a8855b508cf0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1425.202400] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bfba08e-98db-41d4-b52c-8b07eb80e48e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1425.209496] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbd6db0b-0058-4522-82b9-d1280dca18af {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1425.222067] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1425.229701] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1425.230803] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1425.230976] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1426.225322] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1426.225695] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1426.225695] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1427.045596] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1430.045141] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1431.045284] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1431.045656] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1431.045656] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1431.054182] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1481.045109] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1482.045796] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1483.045523] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1487.044707] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1487.045083] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1487.045083] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1487.054848] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1487.055080] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1487.055247] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1487.055398] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1487.056440] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1beae1a-088a-4503-82fe-db1db55bd4e2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1487.065107] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-875cbff9-f0cf-4f4a-b765-da554f8631a3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1487.078485] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33dda86a-d71b-4fc6-87da-258b0029258f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1487.084362] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d808883a-9514-48fe-9736-63f84cb63319 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1487.112823] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181325MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1487.112965] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1487.113174] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1487.143277] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1487.143443] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1487.157195] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c11ba9eb-0f0d-4633-ab1e-8cc2fa617937 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1487.166018] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29ee0b71-5141-490b-b8ff-7c09abecd9f4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1487.193594] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-599901ac-0f13-4b6e-82ae-4d9a3162e5f3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1487.201236] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbc372a5-a926-413f-93ac-3c0d6fc1e48b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1487.214951] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1487.222745] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1487.223837] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1487.224014] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1488.219337] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1488.219853] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1490.044573] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1493.046027] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1493.046389] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1493.046389] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1493.054855] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1494.048518] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1539.976274] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "80b8cf1b-b426-4d0c-9134-65936817451b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1539.976574] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "80b8cf1b-b426-4d0c-9134-65936817451b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1540.001959] env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Starting instance... 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1540.123997] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1540.124321] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1540.126716] env[59577]: INFO nova.compute.claims [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1540.274021] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06b6795c-6787-49d4-9791-2c1592a211c2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1540.285624] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-534a43ae-f668-4bbc-bdbd-01ac0a289c8c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1540.325793] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9af2e09-61b8-4d15-b915-ee2769928e84 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
1540.334877] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9e517f2-dbfb-44f7-85bc-2311c3b66b4a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1540.351782] env[59577]: DEBUG nova.compute.provider_tree [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1540.362148] env[59577]: DEBUG nova.scheduler.client.report [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1540.379342] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1540.380181] env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 
80b8cf1b-b426-4d0c-9134-65936817451b] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1540.433030] env[59577]: DEBUG nova.compute.utils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1540.437706] env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1540.437706] env[59577]: DEBUG nova.network.neutron [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1540.449691] env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1540.568119] env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1540.713187] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1540.713436] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1540.713588] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1540.713760] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Flavor pref 0:0:0 {{(pid=59577) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1540.716807] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1540.716807] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1540.716807] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1540.716807] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1540.716807] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1540.717073] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1540.717073] env[59577]: DEBUG nova.virt.hardware [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1540.717073] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bf9e181-8d3a-42ba-90a8-1255642b3817 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1540.729209] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e3ae1e0-cd65-47a2-8a0b-789bcdf4adbb {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1540.749306] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94742d4a-13b6-44f6-92ba-94bd07dfb71f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1540.980407] env[59577]: DEBUG nova.policy [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '640391b57c2c4e56a85281dff7d3c18c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4aaf7c8faae24c8c875d202131afdcad', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize 
/opt/stack/nova/nova/policy.py:203}} [ 1541.045753] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1541.045753] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Cleaning up deleted instances with incomplete migration {{(pid=59577) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 1541.511047] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquiring lock "02000be7-32ff-4158-8ce7-02bcefe7d81c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1541.511047] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Lock "02000be7-32ff-4158-8ce7-02bcefe7d81c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1541.530975] env[59577]: DEBUG nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Starting instance... 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1541.591819] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1541.591819] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1541.591819] env[59577]: INFO nova.compute.claims [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1541.701016] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95ae7e4c-6b4d-40c0-bfc7-670d7956bba7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1541.712023] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-735e7314-dd94-4a1b-90c6-c3c788382402 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1541.744642] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4af4286-64c8-4ae7-ac79-ac5f583e64d4 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1541.752741] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed4c72ad-00a2-40af-8b27-f798daae4f7b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1541.766474] env[59577]: DEBUG nova.compute.provider_tree [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1541.775690] env[59577]: DEBUG nova.scheduler.client.report [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1541.789543] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.201s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1541.790191] env[59577]: DEBUG nova.compute.manager [None 
req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1541.838518] env[59577]: DEBUG nova.compute.utils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1541.839475] env[59577]: DEBUG nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1541.839854] env[59577]: DEBUG nova.network.neutron [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1541.848283] env[59577]: DEBUG nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Start building block device mappings for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1541.922295] env[59577]: DEBUG nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1541.945966] env[59577]: DEBUG nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1541.946202] env[59577]: DEBUG nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1541.947394] env[59577]: DEBUG nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b 
tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1541.947394] env[59577]: DEBUG nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1541.947394] env[59577]: DEBUG nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1541.947394] env[59577]: DEBUG nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1541.947822] env[59577]: DEBUG nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1541.947822] env[59577]: DEBUG nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1541.947928] env[59577]: DEBUG 
nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1541.948108] env[59577]: DEBUG nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1541.948283] env[59577]: DEBUG nova.virt.hardware [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1541.949172] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-678ad1d5-396b-49fd-8829-3fa8acc4651d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1541.954402] env[59577]: DEBUG nova.network.neutron [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Successfully created port: e1e08eb7-7a24-45c9-b981-8c802277ca69 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1541.961764] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4c3ad69-6120-40fe-bd51-a29d1947590e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1542.060058] env[59577]: DEBUG oslo_service.periodic_task [None 
req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1542.258754] env[59577]: DEBUG nova.policy [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f52459af1dc46ed87818c1cd33273b7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d5ea5daba2d4288821454563ef79c04', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1543.806243] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Acquiring lock "3070aef7-432d-4f92-80d5-18efd8cceec3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1543.806243] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Lock "3070aef7-432d-4f92-80d5-18efd8cceec3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1543.822427] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b 
tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1543.879345] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1543.879627] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1543.881572] env[59577]: INFO nova.compute.claims [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1544.030449] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc3fb103-a998-40eb-9cbf-a4f0a95493b0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1544.038937] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cbc2bb3-ece2-47af-b446-f0b27544e393 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1544.044756] env[59577]: DEBUG oslo_service.periodic_task [None 
req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1544.079454] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-963d87bb-e202-430b-980a-f7bb64bf52d9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1544.094187] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf531764-b4f7-4f1b-9315-bad19c2c3158 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1544.109597] env[59577]: DEBUG nova.compute.provider_tree [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1544.111967] env[59577]: DEBUG nova.network.neutron [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Successfully created port: b90aa190-c4a1-44ea-b66b-c4112a20b640 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1544.122162] env[59577]: DEBUG nova.scheduler.client.report [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1544.141165] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1544.141165] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1544.182593] env[59577]: DEBUG nova.compute.utils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1544.188155] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Allocating IP information in the background. 
{{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1544.188377] env[59577]: DEBUG nova.network.neutron [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1544.198712] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1544.289936] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1544.321418] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1544.321418] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1544.321418] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1544.322317] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Flavor pref 0:0:0 
{{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1544.322317] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1544.322317] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1544.322317] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1544.322317] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1544.322563] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1544.322563] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 
tempest-ImagesNegativeTestJSON-1916757989-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1544.322563] env[59577]: DEBUG nova.virt.hardware [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1544.323862] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d59e189-8052-48f1-8f1a-dc29a79cb967 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1544.335227] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4161d5c0-35b1-4dee-a96f-15ee47ed3f63 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1544.375520] env[59577]: DEBUG nova.policy [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1a49cc435b54762b53046b9b2cb7d08', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2e9ca370f23242c8931c1ddcd23b4d58', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1544.708415] env[59577]: DEBUG nova.network.neutron [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 
tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Successfully updated port: e1e08eb7-7a24-45c9-b981-8c802277ca69 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1544.721718] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "refresh_cache-80b8cf1b-b426-4d0c-9134-65936817451b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1544.722371] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquired lock "refresh_cache-80b8cf1b-b426-4d0c-9134-65936817451b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1544.722787] env[59577]: DEBUG nova.network.neutron [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1544.880100] env[59577]: DEBUG nova.network.neutron [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Instance cache missing network info. 
{{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1545.045731] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1545.935604] env[59577]: DEBUG nova.network.neutron [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Updating instance_info_cache with network_info: [{"id": "e1e08eb7-7a24-45c9-b981-8c802277ca69", "address": "fa:16:3e:40:5c:c5", "network": {"id": "b8835e2b-b650-40af-adda-0f61cbdb109f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5cabf3c5743c484db7095e0ffc0e5d73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ee9f433-666e-4d74-96df-c7c7a6ac7fda", "external-id": "nsx-vlan-transportzone-499", "segmentation_id": 499, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape1e08eb7-7a", "ovs_interfaceid": "e1e08eb7-7a24-45c9-b981-8c802277ca69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1545.950970] env[59577]: DEBUG oslo_concurrency.lockutils [None 
req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Releasing lock "refresh_cache-80b8cf1b-b426-4d0c-9134-65936817451b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1545.951307] env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Instance network_info: |[{"id": "e1e08eb7-7a24-45c9-b981-8c802277ca69", "address": "fa:16:3e:40:5c:c5", "network": {"id": "b8835e2b-b650-40af-adda-0f61cbdb109f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5cabf3c5743c484db7095e0ffc0e5d73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ee9f433-666e-4d74-96df-c7c7a6ac7fda", "external-id": "nsx-vlan-transportzone-499", "segmentation_id": 499, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape1e08eb7-7a", "ovs_interfaceid": "e1e08eb7-7a24-45c9-b981-8c802277ca69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1545.951828] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 
80b8cf1b-b426-4d0c-9134-65936817451b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:40:5c:c5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8ee9f433-666e-4d74-96df-c7c7a6ac7fda', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e1e08eb7-7a24-45c9-b981-8c802277ca69', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1545.968342] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1545.969013] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9be59ffd-acd6-43cb-a46c-49f5e32ea131 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1545.985801] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Created folder: OpenStack in parent group-v4. [ 1545.986259] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Creating folder: Project (4aaf7c8faae24c8c875d202131afdcad). Parent ref: group-v398749. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1545.986491] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c8b30de7-5444-4942-8bae-c02bb1f6caf7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1545.996714] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Created folder: Project (4aaf7c8faae24c8c875d202131afdcad) in parent group-v398749. [ 1545.996922] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Creating folder: Instances. Parent ref: group-v398750. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1545.997924] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-efde3208-fe72-4722-8187-d6a6b3b033df {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1546.007568] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Created folder: Instances in parent group-v398750. [ 1546.007845] env[59577]: DEBUG oslo.service.loopingcall [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1546.008087] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1546.008291] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f7cb312b-0411-4ce6-b02f-a8eaef8b7cef {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1546.031980] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1546.031980] env[59577]: value = "task-1933742" [ 1546.031980] env[59577]: _type = "Task" [ 1546.031980] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1546.041741] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933742, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1546.196246] env[59577]: DEBUG nova.network.neutron [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Successfully created port: 68253d38-c345-48da-b7bd-ca6e649885be {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1546.546452] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933742, 'name': CreateVM_Task, 'duration_secs': 0.337468} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1546.546621] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1546.581022] env[59577]: DEBUG oslo_vmware.service [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91321a43-0fc0-4909-aa85-ae331d47d06b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1546.590365] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1546.590365] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1546.590365] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1546.590863] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with 
opID=oslo.vmware-810a544b-264b-4d03-86cd-fff45a073701 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1546.599555] env[59577]: DEBUG oslo_vmware.api [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Waiting for the task: (returnval){ [ 1546.599555] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]520c2b44-9be4-cd7a-8352-b1a688a13e9b" [ 1546.599555] env[59577]: _type = "Task" [ 1546.599555] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1546.613796] env[59577]: DEBUG oslo_vmware.api [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]520c2b44-9be4-cd7a-8352-b1a688a13e9b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1546.855188] env[59577]: DEBUG nova.compute.manager [req-e39ad177-8b68-497d-ab37-d2bddb2d087e req-c6ae1028-59af-428d-bc4d-0e6e4f98afad service nova] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Received event network-vif-plugged-e1e08eb7-7a24-45c9-b981-8c802277ca69 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1546.855188] env[59577]: DEBUG oslo_concurrency.lockutils [req-e39ad177-8b68-497d-ab37-d2bddb2d087e req-c6ae1028-59af-428d-bc4d-0e6e4f98afad service nova] Acquiring lock "80b8cf1b-b426-4d0c-9134-65936817451b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1546.855188] env[59577]: DEBUG oslo_concurrency.lockutils [req-e39ad177-8b68-497d-ab37-d2bddb2d087e req-c6ae1028-59af-428d-bc4d-0e6e4f98afad service nova] Lock "80b8cf1b-b426-4d0c-9134-65936817451b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1546.855311] env[59577]: DEBUG oslo_concurrency.lockutils [req-e39ad177-8b68-497d-ab37-d2bddb2d087e req-c6ae1028-59af-428d-bc4d-0e6e4f98afad service nova] Lock "80b8cf1b-b426-4d0c-9134-65936817451b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1546.855491] env[59577]: DEBUG nova.compute.manager [req-e39ad177-8b68-497d-ab37-d2bddb2d087e req-c6ae1028-59af-428d-bc4d-0e6e4f98afad service nova] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] No waiting events found dispatching network-vif-plugged-e1e08eb7-7a24-45c9-b981-8c802277ca69 {{(pid=59577) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1546.855644] env[59577]: WARNING nova.compute.manager [req-e39ad177-8b68-497d-ab37-d2bddb2d087e req-c6ae1028-59af-428d-bc4d-0e6e4f98afad service nova] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Received unexpected event network-vif-plugged-e1e08eb7-7a24-45c9-b981-8c802277ca69 for instance with vm_state building and task_state spawning. [ 1547.112160] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1547.112160] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1547.112160] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1547.112569] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1547.113310] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1547.113310] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9e438a07-9acb-44a2-a9dd-126bbdc3ff62 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1547.142709] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1547.142912] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1547.144328] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dc9f369-fcb5-4583-931a-4dcd9b451381 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1547.153342] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f625f7eb-a7a5-45e7-ada0-46947345b35c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1547.158732] env[59577]: DEBUG oslo_vmware.api [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Waiting for the task: (returnval){ [ 1547.158732] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52f95e6b-4b08-372e-e9e6-fad50458bf89" [ 1547.158732] env[59577]: _type = "Task" [ 1547.158732] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1547.167126] env[59577]: DEBUG oslo_vmware.api [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52f95e6b-4b08-372e-e9e6-fad50458bf89, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1547.509406] env[59577]: DEBUG nova.network.neutron [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Successfully updated port: b90aa190-c4a1-44ea-b66b-c4112a20b640 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1547.528915] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquiring lock "refresh_cache-02000be7-32ff-4158-8ce7-02bcefe7d81c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1547.529068] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquired lock "refresh_cache-02000be7-32ff-4158-8ce7-02bcefe7d81c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1547.529216] env[59577]: DEBUG nova.network.neutron [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1547.673132] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1547.673132] 
env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Creating directory with path [datastore1] vmware_temp/375b5dde-16bc-4556-a155-bea020f32acc/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1547.673530] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8b4b21ef-868d-45a2-bfa7-b97645a43718 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1547.696246] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Created directory with path [datastore1] vmware_temp/375b5dde-16bc-4556-a155-bea020f32acc/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1547.696472] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Fetch image to [datastore1] vmware_temp/375b5dde-16bc-4556-a155-bea020f32acc/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1547.696644] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/375b5dde-16bc-4556-a155-bea020f32acc/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1547.697466] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5080f65c-0970-43f7-a2e5-226c20a343e0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1547.709237] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79dd4194-f352-4e2d-af4d-b7870a9ce0d3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1547.723934] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d91dfb5-bcee-47e3-8b5f-eea827c932d8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1547.761152] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23f301fe-407f-405f-87aa-52a74ead1fe6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1547.768280] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-faff7fca-aab2-4207-83c2-0803f668c836 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1547.801029] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1547.805215] env[59577]: DEBUG nova.network.neutron [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] 
[instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1547.882939] env[59577]: DEBUG oslo_vmware.rw_handles [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/375b5dde-16bc-4556-a155-bea020f32acc/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1547.956264] env[59577]: DEBUG oslo_vmware.rw_handles [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Completed reading data from the image iterator. {{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1547.956753] env[59577]: DEBUG oslo_vmware.rw_handles [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/375b5dde-16bc-4556-a155-bea020f32acc/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1548.046622] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1548.046622] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1548.046622] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1548.065363] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1548.065363] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1548.065363] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1548.065363] env[59577]: DEBUG nova.compute.resource_tracker [None 
req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1548.070158] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc018f55-142e-445b-8bb1-4329391d2d63 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.081286] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c844ab10-ae11-49e6-985e-133c23a40909 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.115895] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93bf876b-6801-49b9-8bd3-3c58ec30c48a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.129860] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d287918e-0207-4f31-be64-70c088a53f03 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.172279] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181323MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1548.172279] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1548.172279] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1548.234788] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 80b8cf1b-b426-4d0c-9134-65936817451b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1548.234943] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 02000be7-32ff-4158-8ce7-02bcefe7d81c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1548.235083] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 3070aef7-432d-4f92-80d5-18efd8cceec3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1548.235287] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1548.235587] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1548.306224] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b4f5d36-f4c1-47ab-a4de-fdbf110d84fd {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.314428] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23972967-aee0-4a4b-aac4-4e715332cf53 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.347767] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-031f048f-c1e4-44c5-b091-f68cd490d310 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.356219] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08e128a7-a7df-42d9-b68f-587dd25818a2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.371245] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: 
cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1548.385561] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1548.405659] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1548.405879] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1548.885606] env[59577]: DEBUG nova.network.neutron [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Updating instance_info_cache with network_info: [{"id": "b90aa190-c4a1-44ea-b66b-c4112a20b640", "address": "fa:16:3e:81:c8:a9", "network": {"id": "b8835e2b-b650-40af-adda-0f61cbdb109f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": 
"192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.66", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5cabf3c5743c484db7095e0ffc0e5d73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ee9f433-666e-4d74-96df-c7c7a6ac7fda", "external-id": "nsx-vlan-transportzone-499", "segmentation_id": 499, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb90aa190-c4", "ovs_interfaceid": "b90aa190-c4a1-44ea-b66b-c4112a20b640", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1548.904080] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Releasing lock "refresh_cache-02000be7-32ff-4158-8ce7-02bcefe7d81c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1548.904424] env[59577]: DEBUG nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Instance network_info: |[{"id": "b90aa190-c4a1-44ea-b66b-c4112a20b640", "address": "fa:16:3e:81:c8:a9", "network": {"id": "b8835e2b-b650-40af-adda-0f61cbdb109f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.66", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5cabf3c5743c484db7095e0ffc0e5d73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ee9f433-666e-4d74-96df-c7c7a6ac7fda", "external-id": "nsx-vlan-transportzone-499", "segmentation_id": 499, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb90aa190-c4", "ovs_interfaceid": "b90aa190-c4a1-44ea-b66b-c4112a20b640", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1548.904786] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:81:c8:a9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8ee9f433-666e-4d74-96df-c7c7a6ac7fda', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b90aa190-c4a1-44ea-b66b-c4112a20b640', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1548.915329] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Creating folder: Project (1d5ea5daba2d4288821454563ef79c04). Parent ref: group-v398749. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1548.916030] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-aab9c8e9-3861-4771-ae20-83f7d5f0672d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.933252] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Created folder: Project (1d5ea5daba2d4288821454563ef79c04) in parent group-v398749. [ 1548.933252] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Creating folder: Instances. Parent ref: group-v398753. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1548.933252] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ca761a74-818e-4dad-a5a5-0db640757512 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.947048] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Created folder: Instances in parent group-v398753. [ 1548.947331] env[59577]: DEBUG oslo.service.loopingcall [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1548.947564] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1548.948517] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-811e259b-9d9e-4bb1-aabc-03777831afef {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1548.973566] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1548.973566] env[59577]: value = "task-1933745" [ 1548.973566] env[59577]: _type = "Task" [ 1548.973566] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1548.982593] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933745, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1549.007848] env[59577]: DEBUG nova.network.neutron [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Successfully updated port: 68253d38-c345-48da-b7bd-ca6e649885be {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1549.015509] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Acquiring lock "refresh_cache-3070aef7-432d-4f92-80d5-18efd8cceec3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1549.015658] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 
tempest-ImagesNegativeTestJSON-1916757989-project-member] Acquired lock "refresh_cache-3070aef7-432d-4f92-80d5-18efd8cceec3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1549.015806] env[59577]: DEBUG nova.network.neutron [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1549.122064] env[59577]: DEBUG nova.network.neutron [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1549.400709] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1549.485069] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933745, 'name': CreateVM_Task} progress is 99%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1549.506877] env[59577]: DEBUG nova.compute.manager [req-cc0cf25b-b825-417f-bce3-34c01b679352 req-a9dadd08-88c1-4676-b6d9-6d44cf2b5297 service nova] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Received event network-vif-plugged-b90aa190-c4a1-44ea-b66b-c4112a20b640 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1549.507522] env[59577]: DEBUG oslo_concurrency.lockutils [req-cc0cf25b-b825-417f-bce3-34c01b679352 req-a9dadd08-88c1-4676-b6d9-6d44cf2b5297 service nova] Acquiring lock "02000be7-32ff-4158-8ce7-02bcefe7d81c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1549.507522] env[59577]: DEBUG oslo_concurrency.lockutils [req-cc0cf25b-b825-417f-bce3-34c01b679352 req-a9dadd08-88c1-4676-b6d9-6d44cf2b5297 service nova] Lock "02000be7-32ff-4158-8ce7-02bcefe7d81c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1549.507606] env[59577]: DEBUG oslo_concurrency.lockutils [req-cc0cf25b-b825-417f-bce3-34c01b679352 req-a9dadd08-88c1-4676-b6d9-6d44cf2b5297 service nova] Lock "02000be7-32ff-4158-8ce7-02bcefe7d81c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1549.507810] env[59577]: DEBUG nova.compute.manager [req-cc0cf25b-b825-417f-bce3-34c01b679352 req-a9dadd08-88c1-4676-b6d9-6d44cf2b5297 service nova] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] No waiting events found dispatching network-vif-plugged-b90aa190-c4a1-44ea-b66b-c4112a20b640 {{(pid=59577) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1549.507974] env[59577]: WARNING nova.compute.manager [req-cc0cf25b-b825-417f-bce3-34c01b679352 req-a9dadd08-88c1-4676-b6d9-6d44cf2b5297 service nova] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Received unexpected event network-vif-plugged-b90aa190-c4a1-44ea-b66b-c4112a20b640 for instance with vm_state building and task_state spawning. [ 1549.781249] env[59577]: DEBUG nova.network.neutron [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Updating instance_info_cache with network_info: [{"id": "68253d38-c345-48da-b7bd-ca6e649885be", "address": "fa:16:3e:ba:e5:43", "network": {"id": "108f91c0-5832-465b-834c-43e2fad260ee", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-662716491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2e9ca370f23242c8931c1ddcd23b4d58", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee43879-c0b2-47f7-80d0-2c86e3d6d8b5", "external-id": "nsx-vlan-transportzone-151", "segmentation_id": 151, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap68253d38-c3", "ovs_interfaceid": "68253d38-c345-48da-b7bd-ca6e649885be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1549.804422] env[59577]: DEBUG 
oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Releasing lock "refresh_cache-3070aef7-432d-4f92-80d5-18efd8cceec3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1549.808020] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Instance network_info: |[{"id": "68253d38-c345-48da-b7bd-ca6e649885be", "address": "fa:16:3e:ba:e5:43", "network": {"id": "108f91c0-5832-465b-834c-43e2fad260ee", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-662716491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2e9ca370f23242c8931c1ddcd23b4d58", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee43879-c0b2-47f7-80d0-2c86e3d6d8b5", "external-id": "nsx-vlan-transportzone-151", "segmentation_id": 151, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap68253d38-c3", "ovs_interfaceid": "68253d38-c345-48da-b7bd-ca6e649885be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1549.808787] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b 
tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ba:e5:43', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2ee43879-c0b2-47f7-80d0-2c86e3d6d8b5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '68253d38-c345-48da-b7bd-ca6e649885be', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1549.818445] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Creating folder: Project (2e9ca370f23242c8931c1ddcd23b4d58). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1549.818445] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ee51035-1351-46c9-a0e7-76fdd0a1fe4f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1549.829563] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Created folder: Project (2e9ca370f23242c8931c1ddcd23b4d58) in parent group-v398749. [ 1549.829792] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Creating folder: Instances. Parent ref: group-v398756. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1549.830008] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9e2d04db-02af-400a-a0a8-f7a8ec692a0c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1549.843091] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Created folder: Instances in parent group-v398756. [ 1549.843239] env[59577]: DEBUG oslo.service.loopingcall [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1549.843814] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1549.843814] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e673330c-66ea-4042-97fd-101489c101e8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1549.867327] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1549.867327] env[59577]: value = "task-1933748" [ 1549.867327] env[59577]: _type = "Task" [ 1549.867327] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1549.878556] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933748, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1549.994028] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933745, 'name': CreateVM_Task, 'duration_secs': 0.527211} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1549.994028] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1549.994349] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1549.994509] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1549.994818] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1549.995139] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-585a8fd1-16c8-4dc8-99b9-934fece7fb2e {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1550.000790] env[59577]: DEBUG oslo_vmware.api [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Waiting for the task: (returnval){ [ 1550.000790] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52e54b73-3cbf-6fa4-e0f2-d3ebcf57f6a0" [ 1550.000790] env[59577]: _type = "Task" [ 1550.000790] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1550.011139] env[59577]: DEBUG oslo_vmware.api [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52e54b73-3cbf-6fa4-e0f2-d3ebcf57f6a0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1550.046146] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1550.384374] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933748, 'name': CreateVM_Task, 'duration_secs': 0.303727} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1550.384652] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1550.386119] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1550.516152] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1550.517328] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1550.517328] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1550.517328] env[59577]: DEBUG 
oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1550.517328] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1550.517601] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-555fa06c-efb8-4e46-85ee-1c597cf4e6f8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1550.523584] env[59577]: DEBUG oslo_vmware.api [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Waiting for the task: (returnval){ [ 1550.523584] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52883adc-7489-6568-f36c-de3d7ea5d871" [ 1550.523584] env[59577]: _type = "Task" [ 1550.523584] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1550.534627] env[59577]: DEBUG oslo_vmware.api [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52883adc-7489-6568-f36c-de3d7ea5d871, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1551.034972] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1551.036066] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1551.036066] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1551.045549] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1551.898407] env[59577]: DEBUG nova.compute.manager [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Received event network-changed-e1e08eb7-7a24-45c9-b981-8c802277ca69 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1551.901230] env[59577]: DEBUG 
nova.compute.manager [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Refreshing instance network info cache due to event network-changed-e1e08eb7-7a24-45c9-b981-8c802277ca69. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1551.901319] env[59577]: DEBUG oslo_concurrency.lockutils [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] Acquiring lock "refresh_cache-80b8cf1b-b426-4d0c-9134-65936817451b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1551.901465] env[59577]: DEBUG oslo_concurrency.lockutils [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] Acquired lock "refresh_cache-80b8cf1b-b426-4d0c-9134-65936817451b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1551.901834] env[59577]: DEBUG nova.network.neutron [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Refreshing network info cache for port e1e08eb7-7a24-45c9-b981-8c802277ca69 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1553.046786] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1553.048500] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1553.048616] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] 
Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1553.063816] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1553.063986] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1553.064142] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1553.064272] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1553.108190] env[59577]: DEBUG nova.network.neutron [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Updated VIF entry in instance network info cache for port e1e08eb7-7a24-45c9-b981-8c802277ca69. 
{{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1553.111669] env[59577]: DEBUG nova.network.neutron [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Updating instance_info_cache with network_info: [{"id": "e1e08eb7-7a24-45c9-b981-8c802277ca69", "address": "fa:16:3e:40:5c:c5", "network": {"id": "b8835e2b-b650-40af-adda-0f61cbdb109f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.191", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5cabf3c5743c484db7095e0ffc0e5d73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ee9f433-666e-4d74-96df-c7c7a6ac7fda", "external-id": "nsx-vlan-transportzone-499", "segmentation_id": 499, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape1e08eb7-7a", "ovs_interfaceid": "e1e08eb7-7a24-45c9-b981-8c802277ca69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1553.127902] env[59577]: DEBUG oslo_concurrency.lockutils [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] Releasing lock "refresh_cache-80b8cf1b-b426-4d0c-9134-65936817451b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1553.128168] env[59577]: DEBUG nova.compute.manager [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 
req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Received event network-vif-plugged-68253d38-c345-48da-b7bd-ca6e649885be {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1553.128825] env[59577]: DEBUG oslo_concurrency.lockutils [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] Acquiring lock "3070aef7-432d-4f92-80d5-18efd8cceec3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1553.128825] env[59577]: DEBUG oslo_concurrency.lockutils [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] Lock "3070aef7-432d-4f92-80d5-18efd8cceec3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1553.128825] env[59577]: DEBUG oslo_concurrency.lockutils [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] Lock "3070aef7-432d-4f92-80d5-18efd8cceec3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1553.129042] env[59577]: DEBUG nova.compute.manager [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] No waiting events found dispatching network-vif-plugged-68253d38-c345-48da-b7bd-ca6e649885be {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1553.129042] env[59577]: WARNING nova.compute.manager [req-6a59493e-8c58-40c7-abfb-4c04a90a5dd7 req-79f695c4-caa3-4243-a29b-b044dd5e6158 service nova] [instance: 
3070aef7-432d-4f92-80d5-18efd8cceec3] Received unexpected event network-vif-plugged-68253d38-c345-48da-b7bd-ca6e649885be for instance with vm_state building and task_state spawning. [ 1554.045593] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1554.045593] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Cleaning up deleted instances {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 1554.057566] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] There are 0 instances to clean {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 1554.505145] env[59577]: DEBUG nova.compute.manager [req-66064676-468e-497f-9cb7-4df0b56dbfb0 req-cfc27962-7dfd-4f92-a86d-3fa6197c2c53 service nova] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Received event network-changed-b90aa190-c4a1-44ea-b66b-c4112a20b640 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1554.505352] env[59577]: DEBUG nova.compute.manager [req-66064676-468e-497f-9cb7-4df0b56dbfb0 req-cfc27962-7dfd-4f92-a86d-3fa6197c2c53 service nova] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Refreshing instance network info cache due to event network-changed-b90aa190-c4a1-44ea-b66b-c4112a20b640. 
{{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1554.506132] env[59577]: DEBUG oslo_concurrency.lockutils [req-66064676-468e-497f-9cb7-4df0b56dbfb0 req-cfc27962-7dfd-4f92-a86d-3fa6197c2c53 service nova] Acquiring lock "refresh_cache-02000be7-32ff-4158-8ce7-02bcefe7d81c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1554.506280] env[59577]: DEBUG oslo_concurrency.lockutils [req-66064676-468e-497f-9cb7-4df0b56dbfb0 req-cfc27962-7dfd-4f92-a86d-3fa6197c2c53 service nova] Acquired lock "refresh_cache-02000be7-32ff-4158-8ce7-02bcefe7d81c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1554.507538] env[59577]: DEBUG nova.network.neutron [req-66064676-468e-497f-9cb7-4df0b56dbfb0 req-cfc27962-7dfd-4f92-a86d-3fa6197c2c53 service nova] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Refreshing network info cache for port b90aa190-c4a1-44ea-b66b-c4112a20b640 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1555.044751] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1555.766227] env[59577]: DEBUG nova.compute.manager [req-02b2c63c-a5b4-4047-ae7d-d7938519a94f req-9f98457e-47fe-4ff2-8ac4-309fcf3d8c21 service nova] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Received event network-changed-68253d38-c345-48da-b7bd-ca6e649885be {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1555.766520] env[59577]: DEBUG nova.compute.manager [req-02b2c63c-a5b4-4047-ae7d-d7938519a94f req-9f98457e-47fe-4ff2-8ac4-309fcf3d8c21 service nova] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Refreshing instance network info cache due to event 
network-changed-68253d38-c345-48da-b7bd-ca6e649885be. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1555.766651] env[59577]: DEBUG oslo_concurrency.lockutils [req-02b2c63c-a5b4-4047-ae7d-d7938519a94f req-9f98457e-47fe-4ff2-8ac4-309fcf3d8c21 service nova] Acquiring lock "refresh_cache-3070aef7-432d-4f92-80d5-18efd8cceec3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1555.766781] env[59577]: DEBUG oslo_concurrency.lockutils [req-02b2c63c-a5b4-4047-ae7d-d7938519a94f req-9f98457e-47fe-4ff2-8ac4-309fcf3d8c21 service nova] Acquired lock "refresh_cache-3070aef7-432d-4f92-80d5-18efd8cceec3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1555.766935] env[59577]: DEBUG nova.network.neutron [req-02b2c63c-a5b4-4047-ae7d-d7938519a94f req-9f98457e-47fe-4ff2-8ac4-309fcf3d8c21 service nova] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Refreshing network info cache for port 68253d38-c345-48da-b7bd-ca6e649885be {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1555.994199] env[59577]: DEBUG nova.network.neutron [req-66064676-468e-497f-9cb7-4df0b56dbfb0 req-cfc27962-7dfd-4f92-a86d-3fa6197c2c53 service nova] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Updated VIF entry in instance network info cache for port b90aa190-c4a1-44ea-b66b-c4112a20b640. 
{{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1555.996629] env[59577]: DEBUG nova.network.neutron [req-66064676-468e-497f-9cb7-4df0b56dbfb0 req-cfc27962-7dfd-4f92-a86d-3fa6197c2c53 service nova] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Updating instance_info_cache with network_info: [{"id": "b90aa190-c4a1-44ea-b66b-c4112a20b640", "address": "fa:16:3e:81:c8:a9", "network": {"id": "b8835e2b-b650-40af-adda-0f61cbdb109f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.66", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5cabf3c5743c484db7095e0ffc0e5d73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ee9f433-666e-4d74-96df-c7c7a6ac7fda", "external-id": "nsx-vlan-transportzone-499", "segmentation_id": 499, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb90aa190-c4", "ovs_interfaceid": "b90aa190-c4a1-44ea-b66b-c4112a20b640", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1556.010079] env[59577]: DEBUG oslo_concurrency.lockutils [req-66064676-468e-497f-9cb7-4df0b56dbfb0 req-cfc27962-7dfd-4f92-a86d-3fa6197c2c53 service nova] Releasing lock "refresh_cache-02000be7-32ff-4158-8ce7-02bcefe7d81c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1556.583241] env[59577]: DEBUG nova.network.neutron [req-02b2c63c-a5b4-4047-ae7d-d7938519a94f 
req-9f98457e-47fe-4ff2-8ac4-309fcf3d8c21 service nova] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Updated VIF entry in instance network info cache for port 68253d38-c345-48da-b7bd-ca6e649885be. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1556.583604] env[59577]: DEBUG nova.network.neutron [req-02b2c63c-a5b4-4047-ae7d-d7938519a94f req-9f98457e-47fe-4ff2-8ac4-309fcf3d8c21 service nova] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Updating instance_info_cache with network_info: [{"id": "68253d38-c345-48da-b7bd-ca6e649885be", "address": "fa:16:3e:ba:e5:43", "network": {"id": "108f91c0-5832-465b-834c-43e2fad260ee", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-662716491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2e9ca370f23242c8931c1ddcd23b4d58", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2ee43879-c0b2-47f7-80d0-2c86e3d6d8b5", "external-id": "nsx-vlan-transportzone-151", "segmentation_id": 151, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap68253d38-c3", "ovs_interfaceid": "68253d38-c345-48da-b7bd-ca6e649885be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1556.595133] env[59577]: DEBUG oslo_concurrency.lockutils [req-02b2c63c-a5b4-4047-ae7d-d7938519a94f req-9f98457e-47fe-4ff2-8ac4-309fcf3d8c21 service nova] Releasing lock 
"refresh_cache-3070aef7-432d-4f92-80d5-18efd8cceec3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1596.015676] env[59577]: WARNING oslo_vmware.rw_handles [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1596.015676] env[59577]: ERROR oslo_vmware.rw_handles [ 1596.015676] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to 
vmware_temp/375b5dde-16bc-4556-a155-bea020f32acc/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1596.016742] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1596.017038] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Copying Virtual Disk [datastore1] vmware_temp/375b5dde-16bc-4556-a155-bea020f32acc/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] vmware_temp/375b5dde-16bc-4556-a155-bea020f32acc/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1596.018307] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4db6ffdb-747d-4f3a-89f1-4825d5ad7c8e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1596.030566] env[59577]: DEBUG oslo_vmware.api [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Waiting for the task: (returnval){ [ 1596.030566] env[59577]: value = "task-1933749" [ 1596.030566] env[59577]: _type = "Task" [ 1596.030566] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1596.041901] env[59577]: DEBUG oslo_vmware.api [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Task: {'id': task-1933749, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1596.539850] env[59577]: DEBUG oslo_vmware.exceptions [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Fault InvalidArgument not matched. {{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1596.540181] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1596.544131] env[59577]: ERROR nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1596.544131] env[59577]: Faults: ['InvalidArgument'] [ 1596.544131] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Traceback (most recent call last): [ 1596.544131] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1596.544131] env[59577]: ERROR nova.compute.manager 
[instance: 80b8cf1b-b426-4d0c-9134-65936817451b] yield resources [ 1596.544131] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1596.544131] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] self.driver.spawn(context, instance, image_meta, [ 1596.544131] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1596.544131] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1596.544131] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1596.544131] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] self._fetch_image_if_missing(context, vi) [ 1596.544131] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] image_cache(vi, tmp_image_ds_loc) [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] vm_util.copy_virtual_disk( [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 
80b8cf1b-b426-4d0c-9134-65936817451b] session._wait_for_task(vmdk_copy_task) [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] return self.wait_for_task(task_ref) [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] return evt.wait() [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] result = hub.switch() [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1596.544579] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] return self.greenlet.switch() [ 1596.545042] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1596.545042] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] self.f(*self.args, **self.kw) [ 1596.545042] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1596.545042] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] raise 
exceptions.translate_fault(task_info.error) [ 1596.545042] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1596.545042] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Faults: ['InvalidArgument'] [ 1596.545042] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] [ 1596.545042] env[59577]: INFO nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Terminating instance [ 1596.546954] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1596.547220] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1596.547883] env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Start destroying the instance on the hypervisor. 
{{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1596.548116] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1596.548820] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-db924683-fff8-415e-ad2e-017f40151305 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1596.552673] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cb03ea9-9ce7-4a6a-a40c-a567d0f6d5e9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1596.568552] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1596.568552] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1596.570642] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9ca5d6d9-15ba-4163-a0ee-4d10d908a461 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1596.575705] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1596.577846] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-aed3ba44-3a97-45a3-85cd-355d128027df {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1596.580141] env[59577]: DEBUG oslo_vmware.api [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Waiting for the task: (returnval){ [ 1596.580141] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5200d2ef-541b-265c-023a-31c76a51e0a7" [ 1596.580141] env[59577]: _type = "Task" [ 1596.580141] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1596.589440] env[59577]: DEBUG oslo_vmware.api [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5200d2ef-541b-265c-023a-31c76a51e0a7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1596.661450] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1596.661450] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1596.661450] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Deleting the datastore file [datastore1] 80b8cf1b-b426-4d0c-9134-65936817451b {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1596.661620] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0a42aea2-d291-4170-ba99-de692781a5d4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1596.673656] env[59577]: DEBUG oslo_vmware.api [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Waiting for the task: (returnval){ [ 1596.673656] env[59577]: value = "task-1933751" [ 1596.673656] env[59577]: _type = "Task" [ 1596.673656] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1596.686248] env[59577]: DEBUG oslo_vmware.api [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Task: {'id': task-1933751, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1597.090986] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1597.094019] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Creating directory with path [datastore1] vmware_temp/9ba2c16a-703a-45f4-bd8e-2c1ba4c9ca2d/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1597.094019] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f441ae01-1430-4634-abda-081d8a2a72ed {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1597.107019] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Created directory with path [datastore1] vmware_temp/9ba2c16a-703a-45f4-bd8e-2c1ba4c9ca2d/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1597.107019] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None 
req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Fetch image to [datastore1] vmware_temp/9ba2c16a-703a-45f4-bd8e-2c1ba4c9ca2d/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1597.107019] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/9ba2c16a-703a-45f4-bd8e-2c1ba4c9ca2d/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1597.107019] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71d17745-f1b4-4b36-a3c2-37b5e5cff54f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1597.113728] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d297716-e638-45f0-9fd3-24b100f6d3e8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1597.124452] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d755cadf-825d-4a2c-ae91-086466069335 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1597.162604] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51fabee1-9f08-4648-a6af-29f540b5494e {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1597.173017] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c0c2fd1e-6891-4918-9368-d61abf2dad11 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1597.181883] env[59577]: DEBUG oslo_vmware.api [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Task: {'id': task-1933751, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08144} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1597.182356] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1597.186019] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1597.186019] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1597.186019] env[59577]: INFO nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] 
Took 0.63 seconds to destroy the instance on the hypervisor. [ 1597.186019] env[59577]: DEBUG nova.compute.claims [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1597.186019] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1597.186359] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1597.191241] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1597.275492] env[59577]: DEBUG oslo_vmware.rw_handles [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9ba2c16a-703a-45f4-bd8e-2c1ba4c9ca2d/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1597.341855] env[59577]: DEBUG oslo_vmware.rw_handles [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Completed reading data from the image iterator. {{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1597.342111] env[59577]: DEBUG oslo_vmware.rw_handles [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9ba2c16a-703a-45f4-bd8e-2c1ba4c9ca2d/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1597.354728] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c136f290-3932-4b22-aabf-423daa557a25 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1597.363797] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5af6b4ee-e9cc-4739-8ad0-1aa6d46f051d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1597.396692] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcae4932-1c33-4af5-8085-58438edc625d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1597.404554] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc545a77-99cb-4476-9906-0b2df1ee6a7a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1597.418470] env[59577]: DEBUG nova.compute.provider_tree [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1597.430128] env[59577]: DEBUG nova.scheduler.client.report [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1597.446471] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.260s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1597.446471] env[59577]: ERROR nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1597.446471] env[59577]: Faults: ['InvalidArgument'] [ 1597.446471] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Traceback (most recent call last): [ 1597.446471] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1597.446471] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] self.driver.spawn(context, instance, image_meta, [ 1597.446471] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1597.446471] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1597.446471] env[59577]: ERROR nova.compute.manager [instance: 
80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1597.446471] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] self._fetch_image_if_missing(context, vi) [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] image_cache(vi, tmp_image_ds_loc) [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] vm_util.copy_virtual_disk( [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] session._wait_for_task(vmdk_copy_task) [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] return self.wait_for_task(task_ref) [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] return evt.wait() [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] result = hub.switch() [ 1597.446942] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1597.447484] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] return self.greenlet.switch() [ 1597.447484] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1597.447484] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] self.f(*self.args, **self.kw) [ 1597.447484] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1597.447484] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] raise exceptions.translate_fault(task_info.error) [ 1597.447484] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1597.447484] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Faults: ['InvalidArgument'] [ 1597.447484] env[59577]: ERROR nova.compute.manager [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] [ 1597.447484] env[59577]: DEBUG nova.compute.utils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] VimFaultException {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1597.450019] 
env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Build of instance 80b8cf1b-b426-4d0c-9134-65936817451b was re-scheduled: A specified parameter was not correct: fileType [ 1597.450019] env[59577]: Faults: ['InvalidArgument'] {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1597.450417] env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1597.450587] env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1597.450733] env[59577]: DEBUG nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1597.450896] env[59577]: DEBUG nova.network.neutron [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1598.625433] env[59577]: DEBUG nova.network.neutron [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1598.625433] env[59577]: INFO nova.compute.manager [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 80b8cf1b-b426-4d0c-9134-65936817451b] Took 1.17 seconds to deallocate network for instance. 
[ 1598.752524] env[59577]: INFO nova.scheduler.client.report [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Deleted allocations for instance 80b8cf1b-b426-4d0c-9134-65936817451b [ 1598.787102] env[59577]: DEBUG oslo_concurrency.lockutils [None req-672c6fa3-4b5f-4926-b9bc-466b6fd56a68 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "80b8cf1b-b426-4d0c-9134-65936817451b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 58.810s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1604.056359] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1605.048298] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1606.044739] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1608.046484] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1608.046771] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] 
CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1610.043019] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1610.045141] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1610.059051] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1610.059273] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1610.059444] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1610.059599] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1610.060875] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a512c1c-7315-43a2-bcdf-37e0dedb3cd3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.075195] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ed9bfaa-9cb9-4a59-a425-47a363ad4cf4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.092749] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a844afa-6795-49f3-9888-286a07dd6dc2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.100246] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83054dd4-2867-4da6-a227-339cfc2ecede {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.136947] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181332MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1610.137123] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1610.137415] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1610.283751] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 02000be7-32ff-4158-8ce7-02bcefe7d81c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1610.285147] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 3070aef7-432d-4f92-80d5-18efd8cceec3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1610.285147] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1610.285147] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1610.308901] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing inventories for resource provider cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1610.329628] env[59577]: DEBUG 
nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Updating ProviderTree inventory for provider cbad7164-1dca-4b60-b95b-712603801988 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1610.329628] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Updating inventory in ProviderTree for provider cbad7164-1dca-4b60-b95b-712603801988 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1610.340690] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing aggregate associations for resource provider cbad7164-1dca-4b60-b95b-712603801988, aggregates: 804d52ad-c035-42df-bc65-68931e334a13 {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1610.361323] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing trait associations for resource provider cbad7164-1dca-4b60-b95b-712603801988, traits: 
COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1610.430027] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44bd6252-fe1d-4ca4-a19f-a27e1d648756 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.437962] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f94e90e6-b7b6-44b7-b406-519bd9dc9fb5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.471755] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-094a8402-9eaa-46f8-ab56-36f60861ed05 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.479395] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2854267-7d58-4b84-94e7-81600f7d4b7e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.493163] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1610.506520] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1610.527028] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1610.527028] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.389s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1610.555807] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "6855f7a9-0dc0-41e1-900b-112181064d7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1610.556210] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock "6855f7a9-0dc0-41e1-900b-112181064d7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1610.571512] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 
tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1610.623020] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1610.623020] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1610.624222] env[59577]: INFO nova.compute.claims [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1610.745135] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed98cdef-e4b5-4471-9c43-d1f7a55b0de8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.752868] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62de410a-4bdd-4a21-af05-853649ebdd7c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.791168] env[59577]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e607079-a088-48ce-9912-98cd23e5e7f0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.798792] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-656fef10-8ba2-4c7d-806d-0df592989774 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1610.811944] env[59577]: DEBUG nova.compute.provider_tree [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1610.822687] env[59577]: DEBUG nova.scheduler.client.report [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1610.838247] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s {{(pid=59577) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1610.838747] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1610.873362] env[59577]: DEBUG nova.compute.utils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1610.878089] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1610.878089] env[59577]: DEBUG nova.network.neutron [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1610.883400] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Start building block device mappings for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1610.969735] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1610.995148] env[59577]: DEBUG nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1610.995148] env[59577]: DEBUG nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1610.995148] env[59577]: DEBUG nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 
tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1610.995661] env[59577]: DEBUG nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1610.995661] env[59577]: DEBUG nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1610.996037] env[59577]: DEBUG nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1610.996373] env[59577]: DEBUG nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1610.996636] env[59577]: DEBUG nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1610.996912] env[59577]: DEBUG 
nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1610.997189] env[59577]: DEBUG nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1610.997497] env[59577]: DEBUG nova.virt.hardware [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1610.998435] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9698d673-2398-45b2-8e0e-35cfda2cd6ab {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1611.008083] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac2867f8-7d46-47b1-94a1-5b4b4af4fd9b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1611.467957] env[59577]: DEBUG nova.policy [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f0edf02bbc614e6eac34fec21f4610ef', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c9ad1cc1dd840f5b6a9ba5483856d93', 
'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1611.526952] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1612.045648] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1612.909053] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "145736d0-0f14-4ec3-ada2-ddb1fe6f271a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1612.909360] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "145736d0-0f14-4ec3-ada2-ddb1fe6f271a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1612.932565] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 
145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1613.001948] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1613.002282] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1613.004031] env[59577]: INFO nova.compute.claims [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1613.134433] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21a5e005-cbfb-4c73-a8c3-e6da8fe22056 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1613.143030] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fe0f470-5c00-4986-babe-d4036dd4c262 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1613.180714] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8bd6e21-09e3-4b32-9374-647b9e6a5bf6 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1613.189762] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b649cb31-c4fd-4b16-8edf-021b5f6e9401 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1613.208452] env[59577]: DEBUG nova.compute.provider_tree [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1613.229806] env[59577]: DEBUG nova.scheduler.client.report [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1613.251544] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1613.252290] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 
tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1613.315135] env[59577]: DEBUG nova.compute.utils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1613.317972] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1613.317972] env[59577]: DEBUG nova.network.neutron [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1613.345878] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1613.450721] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1613.485982] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:59:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3b473e13-494e-446c-a609-9f10f1dabe69',id=31,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1187939897',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1613.485982] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1613.486144] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1613.486313] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1613.486453] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1613.486593] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1613.486795] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1613.486944] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1613.487502] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1613.488211] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1613.488509] env[59577]: DEBUG nova.virt.hardware [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1613.490167] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5787d2f1-6f6b-48f9-8f7e-acdc901a4c5d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1613.500279] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-585b2cbb-2c36-4b16-91db-c53b6f2b6ab1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1614.389082] env[59577]: DEBUG nova.policy [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '640391b57c2c4e56a85281dff7d3c18c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4aaf7c8faae24c8c875d202131afdcad', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1614.476232] env[59577]: DEBUG nova.network.neutron [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Successfully created port: 13ae991a-32eb-48d3-8c1f-8a46e9822d37 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1615.044511] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1615.044876] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1615.045442] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1615.069309] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1615.069309] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1615.070857] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1615.070857] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1615.070857] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1616.066031] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1616.244358] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Acquiring lock "cc3276aa-0d5a-4a14-ae90-e20a1b823bd3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1616.244878] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Lock "cc3276aa-0d5a-4a14-ae90-e20a1b823bd3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1616.255868] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1616.316066] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1616.316066] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1616.317321] env[59577]: INFO nova.compute.claims [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1616.475816] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4facabc2-f239-4234-8877-68863f0579d9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1616.483917] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aabf18f0-3801-4757-a418-8558082cbbee {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1616.515226] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9030492-fc56-448a-ba6a-761c9d0def7e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1616.523025] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ec8b64d-47cd-4293-9a8e-048998ba91c7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1616.536158] env[59577]: DEBUG nova.compute.provider_tree [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1616.550152] env[59577]: DEBUG nova.scheduler.client.report [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1616.574753] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1616.575437] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1616.611798] env[59577]: DEBUG nova.compute.utils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1616.613272] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1616.613320] env[59577]: DEBUG nova.network.neutron [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1616.625997] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1616.703020] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1616.733583] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1616.733821] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1616.733977] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1616.734242] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1616.734332] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1616.734493] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1616.734706] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1616.734853] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1616.735025] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1616.735184] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1616.735367] env[59577]: DEBUG nova.virt.hardware [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1616.736241] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4b3a5a4-7750-4edd-9cbb-3fe9709d32ab {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1616.745469] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-053a423f-86ac-41d6-bde2-2cf77e3105fe {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1616.868870] env[59577]: DEBUG nova.network.neutron [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Successfully created port: f2b81bd4-61db-4133-9366-6daa92e40812 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1617.013544] env[59577]: DEBUG nova.policy [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b4a5096e0cf04efbaea0a2609420bbe9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b175e7e25a3f4a428193fc7782f957d5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1619.378137] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Acquiring lock "ee50624e-74d6-4afc-9fba-c541f1b83554" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1619.378137] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Lock "ee50624e-74d6-4afc-9fba-c541f1b83554" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1619.393131] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1619.458129] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1619.458129] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1619.459618] env[59577]: INFO nova.compute.claims [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1619.645422] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e127db09-9e65-4bc8-983c-c32cedab3419 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1619.654708] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a48878a-46e2-4152-b144-c4bdcf187565 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1619.692049] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24b06dd4-f657-421b-a113-b888ea2820e2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1619.699799] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-905ee728-c9d3-4f58-b65a-80ff10fb6c1f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1619.714513] env[59577]: DEBUG nova.compute.provider_tree [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1619.724227] env[59577]: DEBUG nova.scheduler.client.report [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1619.744480] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1619.744982] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1619.767118] env[59577]: DEBUG nova.network.neutron [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Successfully created port: 760c39d4-271c-4e5c-bcb2-27aa69984700 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1619.790687] env[59577]: DEBUG nova.compute.utils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1619.791755] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1619.791755] env[59577]: DEBUG nova.network.neutron [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1619.803476] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1619.830731] env[59577]: DEBUG nova.network.neutron [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Successfully updated port: 13ae991a-32eb-48d3-8c1f-8a46e9822d37 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1619.844274] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "refresh_cache-6855f7a9-0dc0-41e1-900b-112181064d7d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1619.845692] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquired lock "refresh_cache-6855f7a9-0dc0-41e1-900b-112181064d7d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1619.846341] env[59577]: DEBUG nova.network.neutron [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1619.900029] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1619.924969] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1619.925291] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1619.925463] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1619.925649] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1619.925799] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1619.925946] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1619.926455] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1619.926640] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1619.926813] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1619.927193] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1619.927368] env[59577]: DEBUG nova.virt.hardware [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1619.928288] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d21cad6c-9d50-4847-a6ed-09a63ba2b1f3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1619.937610] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e93c02a0-2217-4e05-9c07-157efd97dd65 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1620.003848] env[59577]: DEBUG nova.network.neutron [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 
6855f7a9-0dc0-41e1-900b-112181064d7d] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1620.425731] env[59577]: DEBUG nova.policy [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4c0e71cd68ac4403b627df10d4e9a89c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd360139a6dd143dc97c554120e164c52', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1621.201077] env[59577]: DEBUG nova.network.neutron [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Updating instance_info_cache with network_info: [{"id": "13ae991a-32eb-48d3-8c1f-8a46e9822d37", "address": "fa:16:3e:ba:e1:09", "network": {"id": "ed1e671e-cb6d-4b30-b469-e2b0e91786f8", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-483626843-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c9ad1cc1dd840f5b6a9ba5483856d93", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap13ae991a-32", "ovs_interfaceid": "13ae991a-32eb-48d3-8c1f-8a46e9822d37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1621.225163] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Releasing lock "refresh_cache-6855f7a9-0dc0-41e1-900b-112181064d7d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1621.225721] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Instance network_info: |[{"id": "13ae991a-32eb-48d3-8c1f-8a46e9822d37", "address": "fa:16:3e:ba:e1:09", "network": {"id": "ed1e671e-cb6d-4b30-b469-e2b0e91786f8", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-483626843-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c9ad1cc1dd840f5b6a9ba5483856d93", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": 
"nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap13ae991a-32", "ovs_interfaceid": "13ae991a-32eb-48d3-8c1f-8a46e9822d37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1621.227086] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ba:e1:09', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3f4a795c-8718-4a7c-aafe-9da231df10f8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '13ae991a-32eb-48d3-8c1f-8a46e9822d37', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1621.239230] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Creating folder: Project (0c9ad1cc1dd840f5b6a9ba5483856d93). Parent ref: group-v398749. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1621.239987] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1133a248-3b9e-4e64-adb2-f07951c1132e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1621.265657] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Created folder: Project (0c9ad1cc1dd840f5b6a9ba5483856d93) in parent group-v398749. [ 1621.265657] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Creating folder: Instances. Parent ref: group-v398759. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1621.265657] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ebe341f0-c531-4d99-892a-f0e43009b041 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1621.277448] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Created folder: Instances in parent group-v398759. [ 1621.278339] env[59577]: DEBUG oslo.service.loopingcall [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1621.278586] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1621.278860] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-717951de-f10d-4748-9161-b18fceec31e5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1621.301718] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1621.301718] env[59577]: value = "task-1933754" [ 1621.301718] env[59577]: _type = "Task" [ 1621.301718] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1621.318084] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933754, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1621.677446] env[59577]: DEBUG nova.compute.manager [req-2a96d277-bc6b-495c-8bc9-3d6c2e6c90a4 req-48e4515b-86a3-4844-a83b-01fe9ad2f26c service nova] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Received event network-vif-plugged-13ae991a-32eb-48d3-8c1f-8a46e9822d37 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1621.677446] env[59577]: DEBUG oslo_concurrency.lockutils [req-2a96d277-bc6b-495c-8bc9-3d6c2e6c90a4 req-48e4515b-86a3-4844-a83b-01fe9ad2f26c service nova] Acquiring lock "6855f7a9-0dc0-41e1-900b-112181064d7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1621.677446] env[59577]: DEBUG oslo_concurrency.lockutils [req-2a96d277-bc6b-495c-8bc9-3d6c2e6c90a4 req-48e4515b-86a3-4844-a83b-01fe9ad2f26c service nova] Lock 
"6855f7a9-0dc0-41e1-900b-112181064d7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1621.677446] env[59577]: DEBUG oslo_concurrency.lockutils [req-2a96d277-bc6b-495c-8bc9-3d6c2e6c90a4 req-48e4515b-86a3-4844-a83b-01fe9ad2f26c service nova] Lock "6855f7a9-0dc0-41e1-900b-112181064d7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1621.677839] env[59577]: DEBUG nova.compute.manager [req-2a96d277-bc6b-495c-8bc9-3d6c2e6c90a4 req-48e4515b-86a3-4844-a83b-01fe9ad2f26c service nova] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] No waiting events found dispatching network-vif-plugged-13ae991a-32eb-48d3-8c1f-8a46e9822d37 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1621.677839] env[59577]: WARNING nova.compute.manager [req-2a96d277-bc6b-495c-8bc9-3d6c2e6c90a4 req-48e4515b-86a3-4844-a83b-01fe9ad2f26c service nova] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Received unexpected event network-vif-plugged-13ae991a-32eb-48d3-8c1f-8a46e9822d37 for instance with vm_state building and task_state spawning. [ 1621.814196] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933754, 'name': CreateVM_Task, 'duration_secs': 0.328623} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1621.814426] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1621.815126] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1621.815339] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1621.815698] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1621.815976] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1114cab8-2b6c-4b03-8e63-6a6f0f737026 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1621.820762] env[59577]: DEBUG oslo_vmware.api [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Waiting for the 
task: (returnval){ [ 1621.820762] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52d7b9d4-faea-6675-60f3-9d13da727d56" [ 1621.820762] env[59577]: _type = "Task" [ 1621.820762] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1621.829521] env[59577]: DEBUG oslo_vmware.api [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52d7b9d4-faea-6675-60f3-9d13da727d56, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1622.338589] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1622.338589] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1622.338589] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 
1623.023899] env[59577]: DEBUG nova.network.neutron [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Successfully updated port: f2b81bd4-61db-4133-9366-6daa92e40812 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1623.036551] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "refresh_cache-145736d0-0f14-4ec3-ada2-ddb1fe6f271a" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1623.037125] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquired lock "refresh_cache-145736d0-0f14-4ec3-ada2-ddb1fe6f271a" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1623.037295] env[59577]: DEBUG nova.network.neutron [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1623.251309] env[59577]: DEBUG nova.network.neutron [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Instance cache missing network info. 
{{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1623.623010] env[59577]: DEBUG nova.compute.manager [req-05d5c9ee-76f8-4ba8-88ce-c1d8d12ca12a req-e4fefdd8-9264-41f7-b33f-ddc3d6011957 service nova] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Received event network-vif-plugged-f2b81bd4-61db-4133-9366-6daa92e40812 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1623.623010] env[59577]: DEBUG oslo_concurrency.lockutils [req-05d5c9ee-76f8-4ba8-88ce-c1d8d12ca12a req-e4fefdd8-9264-41f7-b33f-ddc3d6011957 service nova] Acquiring lock "145736d0-0f14-4ec3-ada2-ddb1fe6f271a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1623.623125] env[59577]: DEBUG oslo_concurrency.lockutils [req-05d5c9ee-76f8-4ba8-88ce-c1d8d12ca12a req-e4fefdd8-9264-41f7-b33f-ddc3d6011957 service nova] Lock "145736d0-0f14-4ec3-ada2-ddb1fe6f271a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1623.623344] env[59577]: DEBUG oslo_concurrency.lockutils [req-05d5c9ee-76f8-4ba8-88ce-c1d8d12ca12a req-e4fefdd8-9264-41f7-b33f-ddc3d6011957 service nova] Lock "145736d0-0f14-4ec3-ada2-ddb1fe6f271a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1623.624571] env[59577]: DEBUG nova.compute.manager [req-05d5c9ee-76f8-4ba8-88ce-c1d8d12ca12a req-e4fefdd8-9264-41f7-b33f-ddc3d6011957 service nova] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] No waiting events found dispatching network-vif-plugged-f2b81bd4-61db-4133-9366-6daa92e40812 {{(pid=59577) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1623.624571] env[59577]: WARNING nova.compute.manager [req-05d5c9ee-76f8-4ba8-88ce-c1d8d12ca12a req-e4fefdd8-9264-41f7-b33f-ddc3d6011957 service nova] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Received unexpected event network-vif-plugged-f2b81bd4-61db-4133-9366-6daa92e40812 for instance with vm_state building and task_state spawning. [ 1624.121939] env[59577]: DEBUG nova.network.neutron [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Successfully created port: 99e66f6f-e73f-444b-852e-36b9125498c3 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1625.098703] env[59577]: DEBUG nova.network.neutron [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Updating instance_info_cache with network_info: [{"id": "f2b81bd4-61db-4133-9366-6daa92e40812", "address": "fa:16:3e:66:f0:d2", "network": {"id": "b8835e2b-b650-40af-adda-0f61cbdb109f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.244", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5cabf3c5743c484db7095e0ffc0e5d73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ee9f433-666e-4d74-96df-c7c7a6ac7fda", "external-id": "nsx-vlan-transportzone-499", "segmentation_id": 499, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf2b81bd4-61", 
"ovs_interfaceid": "f2b81bd4-61db-4133-9366-6daa92e40812", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1625.107523] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Releasing lock "refresh_cache-145736d0-0f14-4ec3-ada2-ddb1fe6f271a" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1625.108845] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Instance network_info: |[{"id": "f2b81bd4-61db-4133-9366-6daa92e40812", "address": "fa:16:3e:66:f0:d2", "network": {"id": "b8835e2b-b650-40af-adda-0f61cbdb109f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.244", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5cabf3c5743c484db7095e0ffc0e5d73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ee9f433-666e-4d74-96df-c7c7a6ac7fda", "external-id": "nsx-vlan-transportzone-499", "segmentation_id": 499, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf2b81bd4-61", "ovs_interfaceid": "f2b81bd4-61db-4133-9366-6daa92e40812", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1625.109527] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:66:f0:d2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8ee9f433-666e-4d74-96df-c7c7a6ac7fda', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f2b81bd4-61db-4133-9366-6daa92e40812', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1625.120578] env[59577]: DEBUG oslo.service.loopingcall [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1625.120911] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1625.122073] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e182b724-c3ea-483d-a8c3-bab4e503b3b2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1625.150936] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1625.150936] env[59577]: value = "task-1933755" [ 1625.150936] env[59577]: _type = "Task" [ 1625.150936] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1625.160170] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933755, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1625.665720] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933755, 'name': CreateVM_Task} progress is 99%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1625.673580] env[59577]: DEBUG nova.network.neutron [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Successfully updated port: 760c39d4-271c-4e5c-bcb2-27aa69984700 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1625.683254] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Acquiring lock "refresh_cache-cc3276aa-0d5a-4a14-ae90-e20a1b823bd3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1625.683444] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Acquired lock "refresh_cache-cc3276aa-0d5a-4a14-ae90-e20a1b823bd3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1625.683570] env[59577]: DEBUG nova.network.neutron [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Building network info cache for instance {{(pid=59577) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 1625.816300] env[59577]: DEBUG nova.network.neutron [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1625.832438] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Acquiring lock "e9b9f5db-afac-494e-9850-c0d82f26fc68" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1625.832438] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Lock "e9b9f5db-afac-494e-9850-c0d82f26fc68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1625.851894] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Starting instance... 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1625.929493] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1625.930446] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1625.931414] env[59577]: INFO nova.compute.claims [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1626.009673] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Acquiring lock "1a375c37-fcec-4442-827d-103352e81035" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1626.009820] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Lock "1a375c37-fcec-4442-827d-103352e81035" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1626.024559] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1626.085944] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1626.145121] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b412bfd-b551-4ce6-b029-9a2fe4cef7e3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.156605] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a77f7dd-f7a9-493d-9e81-dae8e9627b6f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.166400] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933755, 'name': CreateVM_Task, 'duration_secs': 0.541367} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1626.190319] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1626.191268] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1626.191433] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1626.191758] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1626.192559] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb661040-2c52-4222-a016-427550c05a95 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.195209] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1b7890b0-a098-4669-8499-3fe7268feb56 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.204995] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19a08e45-4873-4d78-9247-dfabe36a171f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.208984] env[59577]: DEBUG oslo_vmware.api [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Waiting for the task: (returnval){ [ 1626.208984] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52add175-f50c-8c94-aaef-5d986ed4d76f" [ 1626.208984] env[59577]: _type = "Task" [ 1626.208984] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1626.221497] env[59577]: DEBUG nova.compute.provider_tree [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1626.229717] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1626.229717] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1626.229717] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1626.236750] env[59577]: DEBUG nova.scheduler.client.report [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1626.258050] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.328s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1626.258550] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Start building networks asynchronously for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1626.260897] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.176s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1626.262400] env[59577]: INFO nova.compute.claims [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1626.305245] env[59577]: DEBUG nova.compute.utils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1626.308998] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Allocating IP information in the background. 
{{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1626.310025] env[59577]: DEBUG nova.network.neutron [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1626.318729] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1626.407954] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1626.435836] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1626.436115] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1626.436237] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1626.436416] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Flavor pref 0:0:0 {{(pid=59577) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1626.436561] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1626.436704] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1626.436907] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1626.437123] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1626.437231] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1626.437391] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1626.437563] env[59577]: DEBUG nova.virt.hardware [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1626.438459] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8b9efcf-626f-448c-8fc6-854d2a0e56af {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.451571] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3f71c36-2bdf-4e9c-89b8-a52c4451810c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.472070] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bef23f89-3e8b-4251-b8ea-679587edb8ca {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.481634] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6dd6e83-dc95-4dac-a622-679fac5019c9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.517071] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c6acaa3-a3b6-44da-9f88-b2438408d6c8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.527498] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4bfc721-29a9-4eaa-b95a-2aea9c959e6e {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.543264] env[59577]: DEBUG nova.compute.provider_tree [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1626.555225] env[59577]: DEBUG nova.scheduler.client.report [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1626.577849] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.315s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1626.577849] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Start building networks asynchronously for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1626.613773] env[59577]: DEBUG nova.compute.utils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1626.616677] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1626.616890] env[59577]: DEBUG nova.network.neutron [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1626.629419] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1626.709683] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1626.739019] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1626.739019] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1626.739019] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1626.739201] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Flavor pref 0:0:0 
{{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1626.739201] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1626.739201] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1626.739201] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1626.739398] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1626.739683] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1626.739949] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 
tempest-SecurityGroupsTestJSON-259355471-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1626.740249] env[59577]: DEBUG nova.virt.hardware [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1626.742171] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac12a749-6007-4431-a6c3-8bdb167d4f7f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.748030] env[59577]: DEBUG nova.policy [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2f4c33e6694a485d9654a87b991be811', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '68e8db5d904c464d90ceaf914109f393', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1626.753912] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aba30b87-b085-4209-8d83-3af6a041b606 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1626.921751] env[59577]: DEBUG nova.policy [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 
tempest-SecurityGroupsTestJSON-259355471-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '75113aa976714d328ca5ffa0848096eb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7dd93200661c419f8ca4baf28c094a99', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1627.552108] env[59577]: DEBUG nova.network.neutron [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Updating instance_info_cache with network_info: [{"id": "760c39d4-271c-4e5c-bcb2-27aa69984700", "address": "fa:16:3e:2d:ee:e6", "network": {"id": "3a10561e-da46-4351-98b3-035fd712ec58", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-6864498-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b175e7e25a3f4a428193fc7782f957d5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2be3fdb5-359e-43bd-8c20-2ff00e81db55", "external-id": "nsx-vlan-transportzone-986", "segmentation_id": 986, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap760c39d4-27", "ovs_interfaceid": "760c39d4-271c-4e5c-bcb2-27aa69984700", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1627.566981] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Releasing lock "refresh_cache-cc3276aa-0d5a-4a14-ae90-e20a1b823bd3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1627.566981] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Instance network_info: |[{"id": "760c39d4-271c-4e5c-bcb2-27aa69984700", "address": "fa:16:3e:2d:ee:e6", "network": {"id": "3a10561e-da46-4351-98b3-035fd712ec58", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-6864498-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b175e7e25a3f4a428193fc7782f957d5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2be3fdb5-359e-43bd-8c20-2ff00e81db55", "external-id": "nsx-vlan-transportzone-986", "segmentation_id": 986, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap760c39d4-27", "ovs_interfaceid": "760c39d4-271c-4e5c-bcb2-27aa69984700", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1627.568409] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2d:ee:e6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2be3fdb5-359e-43bd-8c20-2ff00e81db55', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '760c39d4-271c-4e5c-bcb2-27aa69984700', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1627.577925] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Creating folder: Project (b175e7e25a3f4a428193fc7782f957d5). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1627.578780] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f5e0db7d-ecb4-4bec-b092-b4217c000e2f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1627.592703] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Created folder: Project (b175e7e25a3f4a428193fc7782f957d5) in parent group-v398749. [ 1627.592703] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Creating folder: Instances. Parent ref: group-v398763. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1627.592703] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9972d7bc-d468-4c9f-bd7e-bbe947041e65 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1627.602012] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Created folder: Instances in parent group-v398763.
[ 1627.602428] env[59577]: DEBUG oslo.service.loopingcall [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1627.602678] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1627.602905] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-47ba2488-dedc-4a69-bd4d-787eb6fc20c7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1627.625087] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1627.625087] env[59577]: value = "task-1933758"
[ 1627.625087] env[59577]: _type = "Task"
[ 1627.625087] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1627.634356] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933758, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1627.698877] env[59577]: DEBUG nova.compute.manager [req-17e0c062-0457-4ae6-85ef-07f73e0beeee req-a0500633-5f5d-4da4-9caa-ad2a9133b46a service nova] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Received event network-changed-13ae991a-32eb-48d3-8c1f-8a46e9822d37 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1627.699108] env[59577]: DEBUG nova.compute.manager [req-17e0c062-0457-4ae6-85ef-07f73e0beeee req-a0500633-5f5d-4da4-9caa-ad2a9133b46a service nova] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Refreshing instance network info cache due to event network-changed-13ae991a-32eb-48d3-8c1f-8a46e9822d37. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1627.699365] env[59577]: DEBUG oslo_concurrency.lockutils [req-17e0c062-0457-4ae6-85ef-07f73e0beeee req-a0500633-5f5d-4da4-9caa-ad2a9133b46a service nova] Acquiring lock "refresh_cache-6855f7a9-0dc0-41e1-900b-112181064d7d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1627.699550] env[59577]: DEBUG oslo_concurrency.lockutils [req-17e0c062-0457-4ae6-85ef-07f73e0beeee req-a0500633-5f5d-4da4-9caa-ad2a9133b46a service nova] Acquired lock "refresh_cache-6855f7a9-0dc0-41e1-900b-112181064d7d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1627.699771] env[59577]: DEBUG nova.network.neutron [req-17e0c062-0457-4ae6-85ef-07f73e0beeee req-a0500633-5f5d-4da4-9caa-ad2a9133b46a service nova] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Refreshing network info cache for port 13ae991a-32eb-48d3-8c1f-8a46e9822d37 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1628.136470] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933758, 'name': CreateVM_Task} progress is 25%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1628.643252] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933758, 'name': CreateVM_Task, 'duration_secs': 0.653406} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1628.643252] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1628.643252] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1628.643252] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1628.643252] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 1628.643771] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3e44980f-e794-4a63-a9a6-aa385225e366 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1628.654489] env[59577]: DEBUG oslo_vmware.api [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Waiting for the task: (returnval){
[ 1628.654489] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52035af9-7807-fee0-b396-9919b8f2447f"
[ 1628.654489] env[59577]: _type = "Task"
[ 1628.654489] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1628.671231] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1628.671231] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1628.671978] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1628.769083] env[59577]: DEBUG nova.network.neutron [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Successfully created port: 9975edaa-cbba-491b-b8a7-ad6fccdcdf24 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1629.253642] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "47aac36c-3f70-40a8-ab60-cebba86d3f85" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1629.253953] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "47aac36c-3f70-40a8-ab60-cebba86d3f85" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1629.272024] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1629.288296] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "b9d0daac-02e6-4862-b3de-64223d5a4a76" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1629.288374] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "b9d0daac-02e6-4862-b3de-64223d5a4a76" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1629.302839] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1629.360022] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1629.360022] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1629.360022] env[59577]: INFO nova.compute.claims [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1629.389872] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1629.461599] env[59577]: DEBUG nova.network.neutron [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Successfully created port: c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1629.575143] env[59577]: DEBUG nova.network.neutron [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Successfully updated port: 99e66f6f-e73f-444b-852e-36b9125498c3 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1629.592342] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Acquiring lock "refresh_cache-ee50624e-74d6-4afc-9fba-c541f1b83554" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1629.592495] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Acquired lock "refresh_cache-ee50624e-74d6-4afc-9fba-c541f1b83554" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1629.593150] env[59577]: DEBUG nova.network.neutron [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1629.613958] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaa63391-f76b-4f44-9e68-9d3350748772 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1629.622405] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4621d983-e6fd-45b6-b63d-51e263852bf8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1629.662025] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-402e2490-1536-498e-8757-a960899c5af8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1629.671529] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-903c98a7-a9d6-4ad9-96d4-bd35f1ac7bf7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1629.686438] env[59577]: DEBUG nova.compute.provider_tree [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1629.695648] env[59577]: DEBUG nova.scheduler.client.report [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1629.727790] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.365s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1629.727790] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1629.730072] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.340s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1629.731503] env[59577]: INFO nova.compute.claims [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1629.776299] env[59577]: DEBUG nova.compute.utils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1629.778750] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1629.779026] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1629.792034] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1629.878150] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1629.913902] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1629.913902] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1629.913902] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1629.914065] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1629.914065] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1629.914065] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1629.914065] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1629.914065] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1629.914459] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1629.914774] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1629.915108] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1629.916947] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e66df5d2-f4c3-4bbc-92c7-fd1cb9f31631 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1629.927208] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e9d37fa-403e-41a7-ac41-f5ea3dc192ef {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1629.981908] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc318d98-2e17-4c7a-b9c8-229af1ab2f12 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1629.990379] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57e14c78-ceaa-41ea-9c1d-0e75a9c7e968 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1630.021093] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48636137-e13b-4252-9343-17df3726fa63 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1630.028183] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d886164-5630-48a9-9a6d-4e898d848be4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1630.042830] env[59577]: DEBUG nova.compute.provider_tree [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1630.045953] env[59577]: DEBUG nova.network.neutron [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1630.056767] env[59577]: DEBUG nova.scheduler.client.report [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1630.072724] env[59577]: DEBUG nova.network.neutron [req-17e0c062-0457-4ae6-85ef-07f73e0beeee req-a0500633-5f5d-4da4-9caa-ad2a9133b46a service nova] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Updated VIF entry in instance network info cache for port 13ae991a-32eb-48d3-8c1f-8a46e9822d37. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1630.072724] env[59577]: DEBUG nova.network.neutron [req-17e0c062-0457-4ae6-85ef-07f73e0beeee req-a0500633-5f5d-4da4-9caa-ad2a9133b46a service nova] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Updating instance_info_cache with network_info: [{"id": "13ae991a-32eb-48d3-8c1f-8a46e9822d37", "address": "fa:16:3e:ba:e1:09", "network": {"id": "ed1e671e-cb6d-4b30-b469-e2b0e91786f8", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-483626843-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c9ad1cc1dd840f5b6a9ba5483856d93", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap13ae991a-32", "ovs_interfaceid": "13ae991a-32eb-48d3-8c1f-8a46e9822d37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1630.075416] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1630.075904] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1630.084113] env[59577]: DEBUG oslo_concurrency.lockutils [req-17e0c062-0457-4ae6-85ef-07f73e0beeee req-a0500633-5f5d-4da4-9caa-ad2a9133b46a service nova] Releasing lock "refresh_cache-6855f7a9-0dc0-41e1-900b-112181064d7d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1630.124395] env[59577]: DEBUG nova.compute.utils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1630.126171] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1630.126171] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1630.143416] env[59577]: DEBUG nova.policy [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1c613cf7383a4f65a00846948b4bf1ea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9b03ba07ab9c413f8895ab62a8379a1c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1630.146347] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1630.195375] env[59577]: DEBUG nova.compute.manager [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Received event network-changed-f2b81bd4-61db-4133-9366-6daa92e40812 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1630.198154] env[59577]: DEBUG nova.compute.manager [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Refreshing instance network info cache due to event network-changed-f2b81bd4-61db-4133-9366-6daa92e40812. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1630.198154] env[59577]: DEBUG oslo_concurrency.lockutils [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] Acquiring lock "refresh_cache-145736d0-0f14-4ec3-ada2-ddb1fe6f271a" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1630.198154] env[59577]: DEBUG oslo_concurrency.lockutils [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] Acquired lock "refresh_cache-145736d0-0f14-4ec3-ada2-ddb1fe6f271a" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1630.198154] env[59577]: DEBUG nova.network.neutron [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Refreshing network info cache for port f2b81bd4-61db-4133-9366-6daa92e40812 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1630.241936] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1630.269770] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1630.272186] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1630.272186] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1630.272186] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1630.272186] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1630.272186] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1630.272433] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1630.272433] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1630.272433] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1630.272433] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1630.272433] env[59577]: DEBUG nova.virt.hardware [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1630.274884] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39861d53-58a4-4b37-ace1-367a88de1ba9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1630.285946] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81fd33c2-9eec-4248-9373-501c597324e0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1630.474068] env[59577]: DEBUG nova.policy [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1c613cf7383a4f65a00846948b4bf1ea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9b03ba07ab9c413f8895ab62a8379a1c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1631.148883] env[59577]: DEBUG nova.network.neutron [None
req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Successfully created port: fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1631.279989] env[59577]: DEBUG nova.network.neutron [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Updating instance_info_cache with network_info: [{"id": "99e66f6f-e73f-444b-852e-36b9125498c3", "address": "fa:16:3e:34:9e:96", "network": {"id": "d46458c9-9414-4916-b34e-8b5535846531", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1112556393-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d360139a6dd143dc97c554120e164c52", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ba4f6497-e2b4-43b5-9819-6927865ae974", "external-id": "nsx-vlan-transportzone-112", "segmentation_id": 112, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99e66f6f-e7", "ovs_interfaceid": "99e66f6f-e73f-444b-852e-36b9125498c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1631.297501] env[59577]: DEBUG oslo_concurrency.lockutils [None 
req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Releasing lock "refresh_cache-ee50624e-74d6-4afc-9fba-c541f1b83554" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1631.297818] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Instance network_info: |[{"id": "99e66f6f-e73f-444b-852e-36b9125498c3", "address": "fa:16:3e:34:9e:96", "network": {"id": "d46458c9-9414-4916-b34e-8b5535846531", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1112556393-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d360139a6dd143dc97c554120e164c52", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ba4f6497-e2b4-43b5-9819-6927865ae974", "external-id": "nsx-vlan-transportzone-112", "segmentation_id": 112, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99e66f6f-e7", "ovs_interfaceid": "99e66f6f-e73f-444b-852e-36b9125498c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1631.298204] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 
tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:9e:96', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ba4f6497-e2b4-43b5-9819-6927865ae974', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '99e66f6f-e73f-444b-852e-36b9125498c3', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1631.306900] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Creating folder: Project (d360139a6dd143dc97c554120e164c52). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1631.307438] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-00bcdbd5-9f87-408c-8f25-89787f4e9af2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1631.318878] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Created folder: Project (d360139a6dd143dc97c554120e164c52) in parent group-v398749. [ 1631.319096] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Creating folder: Instances. Parent ref: group-v398766. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1631.319338] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2ec10710-cd26-40b4-8ee1-99d7bc6e8dc6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1631.329340] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Created folder: Instances in parent group-v398766. [ 1631.329545] env[59577]: DEBUG oslo.service.loopingcall [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1631.329740] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1631.329941] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e620817c-ae64-407e-9ee9-2876bdcce29b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1631.352251] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1631.352251] env[59577]: value = "task-1933761" [ 1631.352251] env[59577]: _type = "Task" [ 1631.352251] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1631.359542] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933761, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1631.776284] env[59577]: DEBUG nova.network.neutron [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Updated VIF entry in instance network info cache for port f2b81bd4-61db-4133-9366-6daa92e40812. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1631.776610] env[59577]: DEBUG nova.network.neutron [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Updating instance_info_cache with network_info: [{"id": "f2b81bd4-61db-4133-9366-6daa92e40812", "address": "fa:16:3e:66:f0:d2", "network": {"id": "b8835e2b-b650-40af-adda-0f61cbdb109f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.244", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5cabf3c5743c484db7095e0ffc0e5d73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8ee9f433-666e-4d74-96df-c7c7a6ac7fda", "external-id": "nsx-vlan-transportzone-499", "segmentation_id": 499, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf2b81bd4-61", "ovs_interfaceid": "f2b81bd4-61db-4133-9366-6daa92e40812", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1631.788346] env[59577]: DEBUG 
oslo_concurrency.lockutils [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] Releasing lock "refresh_cache-145736d0-0f14-4ec3-ada2-ddb1fe6f271a" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1631.789411] env[59577]: DEBUG nova.compute.manager [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Received event network-vif-plugged-760c39d4-271c-4e5c-bcb2-27aa69984700 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1631.789411] env[59577]: DEBUG oslo_concurrency.lockutils [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] Acquiring lock "cc3276aa-0d5a-4a14-ae90-e20a1b823bd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1631.789411] env[59577]: DEBUG oslo_concurrency.lockutils [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] Lock "cc3276aa-0d5a-4a14-ae90-e20a1b823bd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1631.789411] env[59577]: DEBUG oslo_concurrency.lockutils [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] Lock "cc3276aa-0d5a-4a14-ae90-e20a1b823bd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1631.789601] env[59577]: DEBUG nova.compute.manager [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] [instance: 
cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] No waiting events found dispatching network-vif-plugged-760c39d4-271c-4e5c-bcb2-27aa69984700 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1631.789601] env[59577]: WARNING nova.compute.manager [req-2962b3b5-877c-439d-adea-3d262384bb0e req-abe33d71-c53b-4baa-9f82-1deb1dd9beae service nova] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Received unexpected event network-vif-plugged-760c39d4-271c-4e5c-bcb2-27aa69984700 for instance with vm_state building and task_state spawning. [ 1631.863744] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933761, 'name': CreateVM_Task, 'duration_secs': 0.361617} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1631.864420] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1631.865621] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1631.865866] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1631.866207] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 
tempest-ServersTestFqdnHostnames-148247989-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1631.866554] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7a32f766-e4c3-49bf-8f3c-5ebf095d925e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1631.871760] env[59577]: DEBUG oslo_vmware.api [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Waiting for the task: (returnval){ [ 1631.871760] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52db9518-6ba2-52ef-d00e-10aa14f8a30a" [ 1631.871760] env[59577]: _type = "Task" [ 1631.871760] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1631.884172] env[59577]: DEBUG oslo_vmware.api [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52db9518-6ba2-52ef-d00e-10aa14f8a30a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1631.895892] env[59577]: DEBUG nova.network.neutron [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Successfully updated port: 9975edaa-cbba-491b-b8a7-ad6fccdcdf24 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1631.909972] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Acquiring lock "refresh_cache-1a375c37-fcec-4442-827d-103352e81035" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1631.910252] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Acquired lock "refresh_cache-1a375c37-fcec-4442-827d-103352e81035" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1631.914448] env[59577]: DEBUG nova.network.neutron [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1631.999356] env[59577]: DEBUG nova.network.neutron [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Instance cache missing network info. 
{{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1632.121301] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Acquiring lock "e7945a83-b063-42c4-9991-7f1e0545361d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1632.122173] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Lock "e7945a83-b063-42c4-9991-7f1e0545361d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1632.387017] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1632.389058] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1632.389058] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 
tempest-ServersTestFqdnHostnames-148247989-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1632.466019] env[59577]: DEBUG nova.network.neutron [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Updating instance_info_cache with network_info: [{"id": "9975edaa-cbba-491b-b8a7-ad6fccdcdf24", "address": "fa:16:3e:b4:8b:fa", "network": {"id": "660f3ea5-92a2-4c13-bda2-14d48e0151bb", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1064616051-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7dd93200661c419f8ca4baf28c094a99", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32faf59b-014c-4f1f-8331-40df95bf741f", "external-id": "nsx-vlan-transportzone-996", "segmentation_id": 996, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9975edaa-cb", "ovs_interfaceid": "9975edaa-cbba-491b-b8a7-ad6fccdcdf24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1632.471690] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f 
tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Successfully created port: b8edabde-eff3-4547-b3c3-53d0e44af941 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1632.499468] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Releasing lock "refresh_cache-1a375c37-fcec-4442-827d-103352e81035" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1632.499907] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Instance network_info: |[{"id": "9975edaa-cbba-491b-b8a7-ad6fccdcdf24", "address": "fa:16:3e:b4:8b:fa", "network": {"id": "660f3ea5-92a2-4c13-bda2-14d48e0151bb", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1064616051-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7dd93200661c419f8ca4baf28c094a99", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32faf59b-014c-4f1f-8331-40df95bf741f", "external-id": "nsx-vlan-transportzone-996", "segmentation_id": 996, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9975edaa-cb", "ovs_interfaceid": "9975edaa-cbba-491b-b8a7-ad6fccdcdf24", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1632.500543] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b4:8b:fa', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '32faf59b-014c-4f1f-8331-40df95bf741f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9975edaa-cbba-491b-b8a7-ad6fccdcdf24', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1632.513790] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Creating folder: Project (7dd93200661c419f8ca4baf28c094a99). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1632.514656] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dc6d00c0-7349-44a7-8312-51e27e41fe1a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1632.525511] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Created folder: Project (7dd93200661c419f8ca4baf28c094a99) in parent group-v398749. 
[ 1632.525702] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Creating folder: Instances. Parent ref: group-v398769. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1632.525939] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-75a7639c-ccb6-4881-b58b-5be4141cbc84 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1632.539138] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Created folder: Instances in parent group-v398769. [ 1632.539224] env[59577]: DEBUG oslo.service.loopingcall [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1632.539385] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1a375c37-fcec-4442-827d-103352e81035] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1632.539591] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5aa51bcd-ca6a-47ce-bb7c-e794c26987f0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1632.563051] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1632.563051] env[59577]: value = "task-1933764" [ 1632.563051] env[59577]: _type = "Task" [ 1632.563051] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1632.570942] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933764, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1632.737743] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Successfully updated port: fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1632.750218] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "refresh_cache-47aac36c-3f70-40a8-ab60-cebba86d3f85" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1632.750541] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquired lock "refresh_cache-47aac36c-3f70-40a8-ab60-cebba86d3f85" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1632.750714] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1632.815191] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 
47aac36c-3f70-40a8-ab60-cebba86d3f85] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1633.079019] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933764, 'name': CreateVM_Task} progress is 25%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1633.267997] env[59577]: DEBUG nova.network.neutron [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Successfully updated port: c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1633.277318] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Acquiring lock "refresh_cache-e9b9f5db-afac-494e-9850-c0d82f26fc68" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1633.277474] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Acquired lock "refresh_cache-e9b9f5db-afac-494e-9850-c0d82f26fc68" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1633.277825] env[59577]: DEBUG nova.network.neutron [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1633.385259] env[59577]: DEBUG nova.network.neutron [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 
tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1633.410775] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Updating instance_info_cache with network_info: [{"id": "fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae", "address": "fa:16:3e:0d:bc:e3", "network": {"id": "cdcb4580-55bf-4bf1-8ab7-1a73dbc4d42a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-2094093215-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b03ba07ab9c413f8895ab62a8379a1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0c293d47-74c0-49d7-a474-cdb643080f6f", "external-id": "nsx-vlan-transportzone-172", "segmentation_id": 172, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfdc91f1d-41", "ovs_interfaceid": "fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1633.429144] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 
tempest-MultipleCreateTestJSON-91958749-project-member] Releasing lock "refresh_cache-47aac36c-3f70-40a8-ab60-cebba86d3f85" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1633.429144] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Instance network_info: |[{"id": "fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae", "address": "fa:16:3e:0d:bc:e3", "network": {"id": "cdcb4580-55bf-4bf1-8ab7-1a73dbc4d42a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-2094093215-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b03ba07ab9c413f8895ab62a8379a1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0c293d47-74c0-49d7-a474-cdb643080f6f", "external-id": "nsx-vlan-transportzone-172", "segmentation_id": 172, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfdc91f1d-41", "ovs_interfaceid": "fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1633.430366] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 
47aac36c-3f70-40a8-ab60-cebba86d3f85] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0d:bc:e3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0c293d47-74c0-49d7-a474-cdb643080f6f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1633.439692] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Creating folder: Project (9b03ba07ab9c413f8895ab62a8379a1c). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1633.440117] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6f5ebf5f-5674-4f93-a4cc-86aadc19c8a6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1633.454188] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Created folder: Project (9b03ba07ab9c413f8895ab62a8379a1c) in parent group-v398749. [ 1633.454188] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Creating folder: Instances. Parent ref: group-v398772. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1633.458358] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-24fb56b3-205c-4fb1-86d4-a56047c69fc3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1633.469031] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Created folder: Instances in parent group-v398772. [ 1633.469241] env[59577]: DEBUG oslo.service.loopingcall [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1633.469442] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1633.469646] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f8bd1a82-9183-4e81-b647-7796f778101b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1633.491704] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1633.491704] env[59577]: value = "task-1933767" [ 1633.491704] env[59577]: _type = "Task" [ 1633.491704] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1633.500756] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933767, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1633.574822] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933764, 'name': CreateVM_Task, 'duration_secs': 0.714774} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1633.574822] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1a375c37-fcec-4442-827d-103352e81035] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1633.575338] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1633.576100] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1633.576100] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1633.576100] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-10cdd886-fe11-4874-a4db-05d021928b50 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1633.581633] env[59577]: DEBUG oslo_vmware.api [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Waiting for the task: (returnval){ [ 1633.581633] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]525ee379-7da2-8cd8-9d45-a67fc892d8d8" [ 1633.581633] env[59577]: _type = "Task" [ 1633.581633] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1633.591439] env[59577]: DEBUG oslo_vmware.api [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]525ee379-7da2-8cd8-9d45-a67fc892d8d8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1634.001801] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933767, 'name': CreateVM_Task, 'duration_secs': 0.50605} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1634.002033] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1634.002645] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1634.091116] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1634.091386] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1634.091596] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1634.091798] env[59577]: DEBUG oslo_concurrency.lockutils [None 
req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1634.092129] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1634.092451] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-762a2b91-b5e2-4de3-9773-eef03d61dd83 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1634.098620] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Waiting for the task: (returnval){ [ 1634.098620] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]520cdcf5-f4a5-1cab-259f-cdef38da758f" [ 1634.098620] env[59577]: _type = "Task" [ 1634.098620] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1634.106052] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]520cdcf5-f4a5-1cab-259f-cdef38da758f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1634.199869] env[59577]: DEBUG nova.compute.manager [req-e42341f2-eba4-465d-91a0-62fc9942ffc7 req-39df6596-8f83-4c1a-87c5-49cc7fc677b6 service nova] [instance: 1a375c37-fcec-4442-827d-103352e81035] Received event network-vif-plugged-9975edaa-cbba-491b-b8a7-ad6fccdcdf24 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1634.199869] env[59577]: DEBUG oslo_concurrency.lockutils [req-e42341f2-eba4-465d-91a0-62fc9942ffc7 req-39df6596-8f83-4c1a-87c5-49cc7fc677b6 service nova] Acquiring lock "1a375c37-fcec-4442-827d-103352e81035-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1634.199869] env[59577]: DEBUG oslo_concurrency.lockutils [req-e42341f2-eba4-465d-91a0-62fc9942ffc7 req-39df6596-8f83-4c1a-87c5-49cc7fc677b6 service nova] Lock "1a375c37-fcec-4442-827d-103352e81035-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1634.199869] env[59577]: DEBUG oslo_concurrency.lockutils [req-e42341f2-eba4-465d-91a0-62fc9942ffc7 req-39df6596-8f83-4c1a-87c5-49cc7fc677b6 service nova] Lock "1a375c37-fcec-4442-827d-103352e81035-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1634.200473] env[59577]: DEBUG nova.compute.manager [req-e42341f2-eba4-465d-91a0-62fc9942ffc7 req-39df6596-8f83-4c1a-87c5-49cc7fc677b6 service nova] [instance: 1a375c37-fcec-4442-827d-103352e81035] No waiting events found dispatching network-vif-plugged-9975edaa-cbba-491b-b8a7-ad6fccdcdf24 {{(pid=59577) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1634.200473] env[59577]: WARNING nova.compute.manager [req-e42341f2-eba4-465d-91a0-62fc9942ffc7 req-39df6596-8f83-4c1a-87c5-49cc7fc677b6 service nova] [instance: 1a375c37-fcec-4442-827d-103352e81035] Received unexpected event network-vif-plugged-9975edaa-cbba-491b-b8a7-ad6fccdcdf24 for instance with vm_state building and task_state spawning. [ 1634.313210] env[59577]: DEBUG nova.network.neutron [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Updating instance_info_cache with network_info: [{"id": "c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3", "address": "fa:16:3e:71:bd:b4", "network": {"id": "26b67ca1-6917-4a10-baab-19a723b51193", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-537555915-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "68e8db5d904c464d90ceaf914109f393", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "415e68b4-3766-4359-afe2-f8563910d98c", "external-id": "nsx-vlan-transportzone-538", "segmentation_id": 538, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc73f9fa7-46", "ovs_interfaceid": "c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1634.328380] env[59577]: DEBUG 
oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Releasing lock "refresh_cache-e9b9f5db-afac-494e-9850-c0d82f26fc68" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1634.328800] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Instance network_info: |[{"id": "c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3", "address": "fa:16:3e:71:bd:b4", "network": {"id": "26b67ca1-6917-4a10-baab-19a723b51193", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-537555915-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "68e8db5d904c464d90ceaf914109f393", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "415e68b4-3766-4359-afe2-f8563910d98c", "external-id": "nsx-vlan-transportzone-538", "segmentation_id": 538, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc73f9fa7-46", "ovs_interfaceid": "c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1634.329637] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 
tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:71:bd:b4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '415e68b4-3766-4359-afe2-f8563910d98c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1634.338529] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Creating folder: Project (68e8db5d904c464d90ceaf914109f393). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1634.339115] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-caf84a6a-d0af-498b-b266-724dab8d4604 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1634.352312] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Created folder: Project (68e8db5d904c464d90ceaf914109f393) in parent group-v398749. [ 1634.352312] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Creating folder: Instances. Parent ref: group-v398775. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1634.352588] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-85960643-c7b4-4a31-bd05-11a0219cdfc7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1634.364411] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Created folder: Instances in parent group-v398775. [ 1634.364667] env[59577]: DEBUG oslo.service.loopingcall [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1634.364857] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1634.365075] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f6b7c6ee-248c-4e59-a58b-5e3cbd64f36e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1634.393085] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1634.393085] env[59577]: value = "task-1933770" [ 1634.393085] env[59577]: _type = "Task" [ 1634.393085] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1634.402581] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933770, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1634.613088] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1634.613088] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1634.613088] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1634.698704] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Acquiring lock "1ebb8847-1932-4ed6-8e56-bf48952cfc9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1634.699478] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Lock 
"1ebb8847-1932-4ed6-8e56-bf48952cfc9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1634.903213] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933770, 'name': CreateVM_Task, 'duration_secs': 0.320835} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1634.903494] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1634.904250] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1634.904459] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1634.904861] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1634.905146] env[59577]: DEBUG oslo_vmware.service [-] Invoking 
HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2b590bc1-9ce9-486e-a123-119806141208 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1634.909882] env[59577]: DEBUG oslo_vmware.api [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Waiting for the task: (returnval){
[ 1634.909882] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5215c46b-9f09-e24f-6498-ef8887400cc1"
[ 1634.909882] env[59577]: _type = "Task"
[ 1634.909882] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1634.917804] env[59577]: DEBUG oslo_vmware.api [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5215c46b-9f09-e24f-6498-ef8887400cc1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1635.150095] env[59577]: DEBUG nova.compute.manager [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Received event network-changed-760c39d4-271c-4e5c-bcb2-27aa69984700 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1635.150095] env[59577]: DEBUG nova.compute.manager [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Refreshing instance network info cache due to event network-changed-760c39d4-271c-4e5c-bcb2-27aa69984700. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1635.150253] env[59577]: DEBUG oslo_concurrency.lockutils [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] Acquiring lock "refresh_cache-cc3276aa-0d5a-4a14-ae90-e20a1b823bd3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1635.150560] env[59577]: DEBUG oslo_concurrency.lockutils [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] Acquired lock "refresh_cache-cc3276aa-0d5a-4a14-ae90-e20a1b823bd3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1635.150836] env[59577]: DEBUG nova.network.neutron [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Refreshing network info cache for port 760c39d4-271c-4e5c-bcb2-27aa69984700 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1635.423089] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1635.423089] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1635.423511] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1635.629926] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Successfully updated port: b8edabde-eff3-4547-b3c3-53d0e44af941 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1635.642735] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "refresh_cache-b9d0daac-02e6-4862-b3de-64223d5a4a76" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1635.642735] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquired lock "refresh_cache-b9d0daac-02e6-4862-b3de-64223d5a4a76" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1635.642735] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1635.793740] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1636.490125] env[59577]: DEBUG nova.network.neutron [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Updated VIF entry in instance network info cache for port 760c39d4-271c-4e5c-bcb2-27aa69984700. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1636.490452] env[59577]: DEBUG nova.network.neutron [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Updating instance_info_cache with network_info: [{"id": "760c39d4-271c-4e5c-bcb2-27aa69984700", "address": "fa:16:3e:2d:ee:e6", "network": {"id": "3a10561e-da46-4351-98b3-035fd712ec58", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-6864498-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b175e7e25a3f4a428193fc7782f957d5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2be3fdb5-359e-43bd-8c20-2ff00e81db55", "external-id": "nsx-vlan-transportzone-986", "segmentation_id": 986, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap760c39d4-27", "ovs_interfaceid": "760c39d4-271c-4e5c-bcb2-27aa69984700", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1636.505147] env[59577]: DEBUG oslo_concurrency.lockutils [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] Releasing lock "refresh_cache-cc3276aa-0d5a-4a14-ae90-e20a1b823bd3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1636.505651] env[59577]: DEBUG nova.compute.manager [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Received event network-vif-plugged-99e66f6f-e73f-444b-852e-36b9125498c3 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1636.505969] env[59577]: DEBUG oslo_concurrency.lockutils [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] Acquiring lock "ee50624e-74d6-4afc-9fba-c541f1b83554-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1636.506299] env[59577]: DEBUG oslo_concurrency.lockutils [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] Lock "ee50624e-74d6-4afc-9fba-c541f1b83554-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1636.506562] env[59577]: DEBUG oslo_concurrency.lockutils [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] Lock "ee50624e-74d6-4afc-9fba-c541f1b83554-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1636.507877] env[59577]: DEBUG nova.compute.manager [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] No waiting events found dispatching network-vif-plugged-99e66f6f-e73f-444b-852e-36b9125498c3 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1636.507877] env[59577]: WARNING nova.compute.manager [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Received unexpected event network-vif-plugged-99e66f6f-e73f-444b-852e-36b9125498c3 for instance with vm_state building and task_state spawning.
[ 1636.507877] env[59577]: DEBUG nova.compute.manager [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Received event network-changed-99e66f6f-e73f-444b-852e-36b9125498c3 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1636.507877] env[59577]: DEBUG nova.compute.manager [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Refreshing instance network info cache due to event network-changed-99e66f6f-e73f-444b-852e-36b9125498c3. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1636.507877] env[59577]: DEBUG oslo_concurrency.lockutils [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] Acquiring lock "refresh_cache-ee50624e-74d6-4afc-9fba-c541f1b83554" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1636.508214] env[59577]: DEBUG oslo_concurrency.lockutils [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] Acquired lock "refresh_cache-ee50624e-74d6-4afc-9fba-c541f1b83554" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1636.508214] env[59577]: DEBUG nova.network.neutron [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Refreshing network info cache for port 99e66f6f-e73f-444b-852e-36b9125498c3 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1636.614423] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Updating instance_info_cache with network_info: [{"id": "b8edabde-eff3-4547-b3c3-53d0e44af941", "address": "fa:16:3e:c3:2b:b9", "network": {"id": "cdcb4580-55bf-4bf1-8ab7-1a73dbc4d42a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-2094093215-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b03ba07ab9c413f8895ab62a8379a1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0c293d47-74c0-49d7-a474-cdb643080f6f", "external-id": "nsx-vlan-transportzone-172", "segmentation_id": 172, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8edabde-ef", "ovs_interfaceid": "b8edabde-eff3-4547-b3c3-53d0e44af941", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1636.630527] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Releasing lock "refresh_cache-b9d0daac-02e6-4862-b3de-64223d5a4a76" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1636.630833] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Instance network_info: |[{"id": "b8edabde-eff3-4547-b3c3-53d0e44af941", "address": "fa:16:3e:c3:2b:b9", "network": {"id": "cdcb4580-55bf-4bf1-8ab7-1a73dbc4d42a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-2094093215-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b03ba07ab9c413f8895ab62a8379a1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0c293d47-74c0-49d7-a474-cdb643080f6f", "external-id": "nsx-vlan-transportzone-172", "segmentation_id": 172, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8edabde-ef", "ovs_interfaceid": "b8edabde-eff3-4547-b3c3-53d0e44af941", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 1636.633140] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c3:2b:b9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0c293d47-74c0-49d7-a474-cdb643080f6f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b8edabde-eff3-4547-b3c3-53d0e44af941', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1636.644646] env[59577]: DEBUG oslo.service.loopingcall [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1636.645240] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1636.647046] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-11a067c4-9946-4d5c-89ae-01ae4a607449 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1636.672325] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1636.672325] env[59577]: value = "task-1933771"
[ 1636.672325] env[59577]: _type = "Task"
[ 1636.672325] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1636.679672] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933771, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1637.009121] env[59577]: DEBUG nova.network.neutron [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Updated VIF entry in instance network info cache for port 99e66f6f-e73f-444b-852e-36b9125498c3. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1637.009279] env[59577]: DEBUG nova.network.neutron [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Updating instance_info_cache with network_info: [{"id": "99e66f6f-e73f-444b-852e-36b9125498c3", "address": "fa:16:3e:34:9e:96", "network": {"id": "d46458c9-9414-4916-b34e-8b5535846531", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1112556393-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d360139a6dd143dc97c554120e164c52", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ba4f6497-e2b4-43b5-9819-6927865ae974", "external-id": "nsx-vlan-transportzone-112", "segmentation_id": 112, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap99e66f6f-e7", "ovs_interfaceid": "99e66f6f-e73f-444b-852e-36b9125498c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1637.025265] env[59577]: DEBUG oslo_concurrency.lockutils [req-bfb0fdbe-eb42-4494-82fc-3197ef6db939 req-49de73a5-648e-4d81-b7a3-080ddeb120ec service nova] Releasing lock "refresh_cache-ee50624e-74d6-4afc-9fba-c541f1b83554" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1637.184040] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933771, 'name': CreateVM_Task, 'duration_secs': 0.304699} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1637.184040] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1637.184040] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1637.184040] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1637.184040] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 1637.184314] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b74d7059-72b2-4814-b6d7-3d0351569753 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1637.189181] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Waiting for the task: (returnval){
[ 1637.189181] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]522c596a-25df-44d3-3d47-5bd1b0023955"
[ 1637.189181] env[59577]: _type = "Task"
[ 1637.189181] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1637.198204] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]522c596a-25df-44d3-3d47-5bd1b0023955, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1637.533129] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "b8002da2-eecd-490a-a34b-c651c28c57fc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1637.533129] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Lock "b8002da2-eecd-490a-a34b-c651c28c57fc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1637.700813] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1637.701085] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1637.701293] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1638.055994] env[59577]: DEBUG nova.compute.manager [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 1a375c37-fcec-4442-827d-103352e81035] Received event network-changed-9975edaa-cbba-491b-b8a7-ad6fccdcdf24 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1638.056221] env[59577]: DEBUG nova.compute.manager [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 1a375c37-fcec-4442-827d-103352e81035] Refreshing instance network info cache due to event network-changed-9975edaa-cbba-491b-b8a7-ad6fccdcdf24. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1638.056435] env[59577]: DEBUG oslo_concurrency.lockutils [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] Acquiring lock "refresh_cache-1a375c37-fcec-4442-827d-103352e81035" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1638.056579] env[59577]: DEBUG oslo_concurrency.lockutils [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] Acquired lock "refresh_cache-1a375c37-fcec-4442-827d-103352e81035" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1638.056738] env[59577]: DEBUG nova.network.neutron [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 1a375c37-fcec-4442-827d-103352e81035] Refreshing network info cache for port 9975edaa-cbba-491b-b8a7-ad6fccdcdf24 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1638.311714] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Acquiring lock "077b8c8d-ee7e-495b-a7f7-676fe7c70f83" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1638.312034] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Lock "077b8c8d-ee7e-495b-a7f7-676fe7c70f83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1638.855216] env[59577]: DEBUG nova.network.neutron [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 1a375c37-fcec-4442-827d-103352e81035] Updated VIF entry in instance network info cache for port 9975edaa-cbba-491b-b8a7-ad6fccdcdf24. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1638.855588] env[59577]: DEBUG nova.network.neutron [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 1a375c37-fcec-4442-827d-103352e81035] Updating instance_info_cache with network_info: [{"id": "9975edaa-cbba-491b-b8a7-ad6fccdcdf24", "address": "fa:16:3e:b4:8b:fa", "network": {"id": "660f3ea5-92a2-4c13-bda2-14d48e0151bb", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1064616051-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7dd93200661c419f8ca4baf28c094a99", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32faf59b-014c-4f1f-8331-40df95bf741f", "external-id": "nsx-vlan-transportzone-996", "segmentation_id": 996, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9975edaa-cb", "ovs_interfaceid": "9975edaa-cbba-491b-b8a7-ad6fccdcdf24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1638.871393] env[59577]: DEBUG oslo_concurrency.lockutils [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] Releasing lock "refresh_cache-1a375c37-fcec-4442-827d-103352e81035" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1638.871658] env[59577]: DEBUG nova.compute.manager [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Received event network-vif-plugged-fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1638.871848] env[59577]: DEBUG oslo_concurrency.lockutils [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] Acquiring lock "47aac36c-3f70-40a8-ab60-cebba86d3f85-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1638.874808] env[59577]: DEBUG oslo_concurrency.lockutils [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] Lock "47aac36c-3f70-40a8-ab60-cebba86d3f85-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1638.874808] env[59577]: DEBUG oslo_concurrency.lockutils [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] Lock "47aac36c-3f70-40a8-ab60-cebba86d3f85-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1638.874808] env[59577]: DEBUG nova.compute.manager [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] No waiting events found dispatching network-vif-plugged-fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1638.874808] env[59577]: WARNING nova.compute.manager [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Received unexpected event network-vif-plugged-fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae for instance with vm_state building and task_state spawning.
[ 1638.875077] env[59577]: DEBUG nova.compute.manager [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Received event network-changed-fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1638.875077] env[59577]: DEBUG nova.compute.manager [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Refreshing instance network info cache due to event network-changed-fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1638.875077] env[59577]: DEBUG oslo_concurrency.lockutils [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] Acquiring lock "refresh_cache-47aac36c-3f70-40a8-ab60-cebba86d3f85" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1638.875077] env[59577]: DEBUG oslo_concurrency.lockutils [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] Acquired lock "refresh_cache-47aac36c-3f70-40a8-ab60-cebba86d3f85" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1638.875077] env[59577]: DEBUG nova.network.neutron [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Refreshing network info cache for port fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1639.219082] env[59577]: DEBUG nova.compute.manager [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Received event network-vif-plugged-c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1639.223098] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Acquiring lock "e9b9f5db-afac-494e-9850-c0d82f26fc68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1639.223414] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Lock "e9b9f5db-afac-494e-9850-c0d82f26fc68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.004s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1639.223802] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Lock "e9b9f5db-afac-494e-9850-c0d82f26fc68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1639.223802] env[59577]: DEBUG nova.compute.manager [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] No waiting events found dispatching network-vif-plugged-c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1639.223916] env[59577]: WARNING nova.compute.manager [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Received unexpected event network-vif-plugged-c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3 for instance with vm_state building and task_state spawning.
[ 1639.224089] env[59577]: DEBUG nova.compute.manager [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Received event network-changed-c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1639.224242] env[59577]: DEBUG nova.compute.manager [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Refreshing instance network info cache due to event network-changed-c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1639.224427] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Acquiring lock "refresh_cache-e9b9f5db-afac-494e-9850-c0d82f26fc68" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1639.224602] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Acquired lock "refresh_cache-e9b9f5db-afac-494e-9850-c0d82f26fc68" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1639.224768] env[59577]: DEBUG nova.network.neutron [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Refreshing network info cache for port c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1639.257678] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9049e4db-20ea-4ff1-a288-b177a2f749ce tempest-ServerActionsTestJSON-1026393023 tempest-ServerActionsTestJSON-1026393023-project-member] Acquiring lock "640c1048-dca1-4fbd-889b-cd8aa23eb3f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1639.258024] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9049e4db-20ea-4ff1-a288-b177a2f749ce tempest-ServerActionsTestJSON-1026393023 tempest-ServerActionsTestJSON-1026393023-project-member] Lock "640c1048-dca1-4fbd-889b-cd8aa23eb3f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1639.310870] env[59577]: DEBUG oslo_concurrency.lockutils [None req-05ebad77-e053-4aa6-9355-404a57c4f831 tempest-AttachVolumeTestJSON-342258731 tempest-AttachVolumeTestJSON-342258731-project-member] Acquiring lock "d25dbbad-94bb-4147-aa56-91aaab4ed077" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1639.311289] env[59577]: DEBUG oslo_concurrency.lockutils [None req-05ebad77-e053-4aa6-9355-404a57c4f831 tempest-AttachVolumeTestJSON-342258731 tempest-AttachVolumeTestJSON-342258731-project-member] Lock "d25dbbad-94bb-4147-aa56-91aaab4ed077" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1639.839225] env[59577]: DEBUG nova.network.neutron [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Updated VIF entry in instance network info cache for port fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae.
{{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1639.839225] env[59577]: DEBUG nova.network.neutron [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Updating instance_info_cache with network_info: [{"id": "fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae", "address": "fa:16:3e:0d:bc:e3", "network": {"id": "cdcb4580-55bf-4bf1-8ab7-1a73dbc4d42a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-2094093215-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b03ba07ab9c413f8895ab62a8379a1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0c293d47-74c0-49d7-a474-cdb643080f6f", "external-id": "nsx-vlan-transportzone-172", "segmentation_id": 172, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfdc91f1d-41", "ovs_interfaceid": "fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1639.855825] env[59577]: DEBUG oslo_concurrency.lockutils [req-5e5cfda2-f7db-4df8-abfe-12bd1efb4e80 req-c606eb52-b727-412f-bf7e-0123769ad03d service nova] Releasing lock "refresh_cache-47aac36c-3f70-40a8-ab60-cebba86d3f85" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1640.087007] env[59577]: DEBUG oslo_concurrency.lockutils [None 
req-2bc953ea-fe68-497c-b3dd-b82e0ba19246 tempest-ServerDiagnosticsV248Test-1906389932 tempest-ServerDiagnosticsV248Test-1906389932-project-member] Acquiring lock "7b8c78af-c9f9-4bff-a667-6faa0fb7c482" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1640.087259] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2bc953ea-fe68-497c-b3dd-b82e0ba19246 tempest-ServerDiagnosticsV248Test-1906389932 tempest-ServerDiagnosticsV248Test-1906389932-project-member] Lock "7b8c78af-c9f9-4bff-a667-6faa0fb7c482" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1640.243880] env[59577]: DEBUG nova.network.neutron [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Updated VIF entry in instance network info cache for port c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3. 
{{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1640.244243] env[59577]: DEBUG nova.network.neutron [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Updating instance_info_cache with network_info: [{"id": "c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3", "address": "fa:16:3e:71:bd:b4", "network": {"id": "26b67ca1-6917-4a10-baab-19a723b51193", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-537555915-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "68e8db5d904c464d90ceaf914109f393", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "415e68b4-3766-4359-afe2-f8563910d98c", "external-id": "nsx-vlan-transportzone-538", "segmentation_id": 538, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc73f9fa7-46", "ovs_interfaceid": "c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1640.258014] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Releasing lock "refresh_cache-e9b9f5db-afac-494e-9850-c0d82f26fc68" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1640.258335] env[59577]: DEBUG nova.compute.manager 
[req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Received event network-vif-plugged-b8edabde-eff3-4547-b3c3-53d0e44af941 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1640.259309] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Acquiring lock "b9d0daac-02e6-4862-b3de-64223d5a4a76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1640.259585] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Lock "b9d0daac-02e6-4862-b3de-64223d5a4a76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1640.259745] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Lock "b9d0daac-02e6-4862-b3de-64223d5a4a76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1640.260060] env[59577]: DEBUG nova.compute.manager [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] No waiting events found dispatching network-vif-plugged-b8edabde-eff3-4547-b3c3-53d0e44af941 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1640.260165] env[59577]: WARNING nova.compute.manager [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 
req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Received unexpected event network-vif-plugged-b8edabde-eff3-4547-b3c3-53d0e44af941 for instance with vm_state building and task_state spawning. [ 1640.260342] env[59577]: DEBUG nova.compute.manager [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Received event network-changed-b8edabde-eff3-4547-b3c3-53d0e44af941 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1640.260510] env[59577]: DEBUG nova.compute.manager [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Refreshing instance network info cache due to event network-changed-b8edabde-eff3-4547-b3c3-53d0e44af941. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1640.260686] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Acquiring lock "refresh_cache-b9d0daac-02e6-4862-b3de-64223d5a4a76" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1640.260829] env[59577]: DEBUG oslo_concurrency.lockutils [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Acquired lock "refresh_cache-b9d0daac-02e6-4862-b3de-64223d5a4a76" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1640.260987] env[59577]: DEBUG nova.network.neutron [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Refreshing network info cache for port b8edabde-eff3-4547-b3c3-53d0e44af941 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 
1641.185289] env[59577]: DEBUG nova.network.neutron [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Updated VIF entry in instance network info cache for port b8edabde-eff3-4547-b3c3-53d0e44af941. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1641.185541] env[59577]: DEBUG nova.network.neutron [req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Updating instance_info_cache with network_info: [{"id": "b8edabde-eff3-4547-b3c3-53d0e44af941", "address": "fa:16:3e:c3:2b:b9", "network": {"id": "cdcb4580-55bf-4bf1-8ab7-1a73dbc4d42a", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-2094093215-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b03ba07ab9c413f8895ab62a8379a1c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0c293d47-74c0-49d7-a474-cdb643080f6f", "external-id": "nsx-vlan-transportzone-172", "segmentation_id": 172, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8edabde-ef", "ovs_interfaceid": "b8edabde-eff3-4547-b3c3-53d0e44af941", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1641.204866] env[59577]: DEBUG oslo_concurrency.lockutils 
[req-9be0b56d-471f-4f6e-b62e-4df1c1d93ec6 req-dcb72307-8ad2-4731-9190-8082ce37ec0a service nova] Releasing lock "refresh_cache-b9d0daac-02e6-4862-b3de-64223d5a4a76" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1641.414901] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2d3a4d0b-cc26-464b-8c28-354c4dcce1d7 tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Acquiring lock "16a36155-12ef-40b2-b94d-db4619ac2f4b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1641.414901] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2d3a4d0b-cc26-464b-8c28-354c4dcce1d7 tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Lock "16a36155-12ef-40b2-b94d-db4619ac2f4b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1644.236675] env[59577]: DEBUG oslo_concurrency.lockutils [None req-0f0396f5-8ca8-48e3-8f50-71e3ccba4a0f tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Acquiring lock "cefa22cd-9d7a-4b86-9ec5-bb9f005b42e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1644.237122] env[59577]: DEBUG oslo_concurrency.lockutils [None req-0f0396f5-8ca8-48e3-8f50-71e3ccba4a0f tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Lock "cefa22cd-9d7a-4b86-9ec5-bb9f005b42e0" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1645.434127] env[59577]: DEBUG oslo_concurrency.lockutils [None req-66f099a0-0d3b-4270-9770-d32e70f14d28 tempest-ServerActionsTestOtherA-455475876 tempest-ServerActionsTestOtherA-455475876-project-member] Acquiring lock "eaa781b6-6542-495d-8430-73416444d972" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1645.434425] env[59577]: DEBUG oslo_concurrency.lockutils [None req-66f099a0-0d3b-4270-9770-d32e70f14d28 tempest-ServerActionsTestOtherA-455475876 tempest-ServerActionsTestOtherA-455475876-project-member] Lock "eaa781b6-6542-495d-8430-73416444d972" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1646.030718] env[59577]: WARNING oslo_vmware.rw_handles [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1646.030718] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1646.030718] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1646.030718] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1646.030718] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1646.030718] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 
1646.030718] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1646.030718] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1646.030718] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1646.030718] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1646.030718] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1646.030718] env[59577]: ERROR oslo_vmware.rw_handles [ 1646.031372] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/9ba2c16a-703a-45f4-bd8e-2c1ba4c9ca2d/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1646.032668] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1646.032915] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Copying Virtual Disk [datastore1] vmware_temp/9ba2c16a-703a-45f4-bd8e-2c1ba4c9ca2d/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] 
vmware_temp/9ba2c16a-703a-45f4-bd8e-2c1ba4c9ca2d/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1646.033303] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-135b46d6-32ae-45a4-84ec-0545dda84c79 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1646.043153] env[59577]: DEBUG oslo_vmware.api [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Waiting for the task: (returnval){ [ 1646.043153] env[59577]: value = "task-1933772" [ 1646.043153] env[59577]: _type = "Task" [ 1646.043153] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1646.051214] env[59577]: DEBUG oslo_vmware.api [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Task: {'id': task-1933772, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1646.557022] env[59577]: DEBUG oslo_vmware.exceptions [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Fault InvalidArgument not matched. 
{{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1646.557022] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1646.557022] env[59577]: ERROR nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1646.557022] env[59577]: Faults: ['InvalidArgument'] [ 1646.557022] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Traceback (most recent call last): [ 1646.557022] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1646.557022] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] yield resources [ 1646.557022] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1646.557022] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] self.driver.spawn(context, instance, image_meta, [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 
02000be7-32ff-4158-8ce7-02bcefe7d81c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] self._fetch_image_if_missing(context, vi) [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] image_cache(vi, tmp_image_ds_loc) [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] vm_util.copy_virtual_disk( [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] session._wait_for_task(vmdk_copy_task) [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] return self.wait_for_task(task_ref) [ 1646.557544] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 
02000be7-32ff-4158-8ce7-02bcefe7d81c] return evt.wait() [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] result = hub.switch() [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] return self.greenlet.switch() [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] self.f(*self.args, **self.kw) [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] raise exceptions.translate_fault(task_info.error) [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Faults: ['InvalidArgument'] [ 1646.557844] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] [ 1646.557844] env[59577]: INFO nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 
tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Terminating instance [ 1646.559097] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1646.559097] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1646.559412] env[59577]: DEBUG nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Start destroying the instance on the hypervisor. 
{{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1646.559599] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1646.559825] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8a53d40d-b1cd-455f-8930-c040470f086e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1646.567020] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9136c5f8-1bce-4698-8a04-f838dc3a4c9c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1646.572124] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1646.572546] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-04fbf4f0-8c03-409d-aae7-d27e95c5a971 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1646.575129] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1646.575444] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1646.576516] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b064d29b-c3b4-490f-96e4-b2e2082a6857 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1646.581601] env[59577]: DEBUG oslo_vmware.api [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Waiting for the task: (returnval){
[ 1646.581601] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]529dedda-ac09-0dd9-5615-39927c958b86"
[ 1646.581601] env[59577]: _type = "Task"
[ 1646.581601] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1646.598164] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1646.598668] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Creating directory with path [datastore1] vmware_temp/fe477266-7b32-4bef-bcbb-8a3aee556ead/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1646.599063] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5e513f11-5781-49aa-af98-271b07d86f62 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1646.624023] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Created directory with path [datastore1] vmware_temp/fe477266-7b32-4bef-bcbb-8a3aee556ead/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1646.624023] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Fetch image to [datastore1] vmware_temp/fe477266-7b32-4bef-bcbb-8a3aee556ead/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1646.624023] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/fe477266-7b32-4bef-bcbb-8a3aee556ead/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1646.624023] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd18c41d-968d-4a1e-8d20-d18d8ef96dd0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1646.634022] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75348994-d70f-4361-a387-645df8a4279c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1646.642249] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ee518b4-1b8d-456b-bbfc-67586a9ff2ec {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1646.677151] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f07d2d36-5a1b-4c44-9142-d751e2504920 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1646.686172] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-db246cf6-aba6-4b97-aaa8-7505ea68655b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1646.719277] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1646.784017] env[59577]: DEBUG oslo_vmware.rw_handles [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fe477266-7b32-4bef-bcbb-8a3aee556ead/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1646.850124] env[59577]: DEBUG oslo_vmware.rw_handles [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Completed reading data from the image iterator. {{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1646.850397] env[59577]: DEBUG oslo_vmware.rw_handles [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fe477266-7b32-4bef-bcbb-8a3aee556ead/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1647.655658] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1647.655967] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1647.658021] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Deleting the datastore file [datastore1] 02000be7-32ff-4158-8ce7-02bcefe7d81c {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1647.658021] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4a1b462a-65e1-40ca-9cc4-abca10e128d2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1647.667588] env[59577]: DEBUG oslo_vmware.api [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Waiting for the task: (returnval){
[ 1647.667588] env[59577]: value = "task-1933774"
[ 1647.667588] env[59577]: _type = "Task"
[ 1647.667588] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1647.679102] env[59577]: DEBUG oslo_vmware.api [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Task: {'id': task-1933774, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1648.180019] env[59577]: DEBUG oslo_vmware.api [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Task: {'id': task-1933774, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07496} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1648.180019] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1648.180019] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1648.180019] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1648.180019] env[59577]: INFO nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Took 1.62 seconds to destroy the instance on the hypervisor.
[ 1648.181772] env[59577]: DEBUG nova.compute.claims [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1648.181945] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1648.182158] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1648.541371] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d478aaf-7656-4ccb-a744-c4e243e60801 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1648.547905] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93352886-eb61-47cf-a68c-a8f6d91be606 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1648.578639] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20b60457-6e1e-4cfe-924d-1bffe9550426 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1648.586389] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cabbbb0-3d85-49ff-9d05-89ece20c04f8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1648.599790] env[59577]: DEBUG nova.compute.provider_tree [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1648.612413] env[59577]: DEBUG nova.scheduler.client.report [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1648.625718] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.443s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1648.626450] env[59577]: ERROR nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1648.626450] env[59577]: Faults: ['InvalidArgument']
[ 1648.626450] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Traceback (most recent call last):
[ 1648.626450] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1648.626450] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] self.driver.spawn(context, instance, image_meta,
[ 1648.626450] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1648.626450] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1648.626450] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1648.626450] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] self._fetch_image_if_missing(context, vi)
[ 1648.626450] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1648.626450] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] image_cache(vi, tmp_image_ds_loc)
[ 1648.626450] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] vm_util.copy_virtual_disk(
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] session._wait_for_task(vmdk_copy_task)
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] return self.wait_for_task(task_ref)
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] return evt.wait()
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] result = hub.switch()
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] return self.greenlet.switch()
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1648.626795] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] self.f(*self.args, **self.kw)
[ 1648.627808] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1648.627808] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] raise exceptions.translate_fault(task_info.error)
[ 1648.627808] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1648.627808] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Faults: ['InvalidArgument']
[ 1648.627808] env[59577]: ERROR nova.compute.manager [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c]
[ 1648.628114] env[59577]: DEBUG nova.compute.utils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] VimFaultException {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1648.633519] env[59577]: DEBUG nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Build of instance 02000be7-32ff-4158-8ce7-02bcefe7d81c was re-scheduled: A specified parameter was not correct: fileType
[ 1648.633519] env[59577]: Faults: ['InvalidArgument'] {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1648.634262] env[59577]: DEBUG nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1648.634262] env[59577]: DEBUG nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1648.634262] env[59577]: DEBUG nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1648.634452] env[59577]: DEBUG nova.network.neutron [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1649.468997] env[59577]: DEBUG nova.network.neutron [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1649.482661] env[59577]: INFO nova.compute.manager [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 02000be7-32ff-4158-8ce7-02bcefe7d81c] Took 0.85 seconds to deallocate network for instance.
[ 1649.603057] env[59577]: INFO nova.scheduler.client.report [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Deleted allocations for instance 02000be7-32ff-4158-8ce7-02bcefe7d81c
[ 1649.628726] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2565fb66-6a4a-470e-aeca-762fab9d459b tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Lock "02000be7-32ff-4158-8ce7-02bcefe7d81c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 108.118s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1649.668711] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1649.736271] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1649.736271] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1649.736271] env[59577]: INFO nova.compute.claims [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1650.142564] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6777ca9e-5164-4896-8597-8beb3ddb0c0c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1650.150771] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44e10e37-6611-40ba-ba29-01ba07bf93f2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1650.190695] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d913d947-6e42-49db-8395-46707a215a58 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1650.198176] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a31149d6-38bb-4338-b6d7-ffff70cd1f78 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1650.215236] env[59577]: DEBUG nova.compute.provider_tree [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1650.227517] env[59577]: DEBUG nova.scheduler.client.report [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1650.250850] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.516s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1650.252466] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1650.290107] env[59577]: DEBUG nova.compute.utils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1650.291578] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1650.291578] env[59577]: DEBUG nova.network.neutron [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1650.301124] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1650.387606] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1650.395657] env[59577]: DEBUG nova.policy [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '966bef1e03f843dfa20a56df092ce8be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '885cd3dd811041428a917d8feadee3b2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1650.417843] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=<?>,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-12-17T13:42:34Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1650.418086] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1650.418287] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1650.418476] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1650.418659] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1650.418815] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1650.419070] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1650.419292] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1650.419490] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1650.419673] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1650.419855] env[59577]: DEBUG nova.virt.hardware [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1650.420825] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4c5f2ce-4797-441a-9623-adccd09a14d2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1650.429354] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d30ed86-91cb-438f-9e20-7b602b821282 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1651.045418] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquiring lock "8ed18ae2-2ba1-424c-b695-846afd7b3501" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1651.045710] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Lock "8ed18ae2-2ba1-424c-b695-846afd7b3501" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1651.148816] env[59577]: DEBUG nova.network.neutron [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Successfully created port: 37e13eef-9f40-415c-b500-8af973e59e8e {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1652.015183] env[59577]: DEBUG nova.network.neutron [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Successfully updated port: 37e13eef-9f40-415c-b500-8af973e59e8e {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1652.029665] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Acquiring lock "refresh_cache-e7945a83-b063-42c4-9991-7f1e0545361d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1652.029830] env[59577]: DEBUG oslo_concurrency.lockutils
[None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Acquired lock "refresh_cache-e7945a83-b063-42c4-9991-7f1e0545361d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1652.029983] env[59577]: DEBUG nova.network.neutron [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1652.074105] env[59577]: DEBUG nova.network.neutron [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1652.302735] env[59577]: DEBUG nova.network.neutron [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Updating instance_info_cache with network_info: [{"id": "37e13eef-9f40-415c-b500-8af973e59e8e", "address": "fa:16:3e:36:cf:5c", "network": {"id": "434db171-07e6-464d-b0ee-79cea1ac09cc", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1758112308-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "885cd3dd811041428a917d8feadee3b2", "mtu": 8950, "physical_network": "default", "tunneled": 
false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "667a2e97-c1be-421d-9941-6b84c2629b43", "external-id": "nsx-vlan-transportzone-484", "segmentation_id": 484, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap37e13eef-9f", "ovs_interfaceid": "37e13eef-9f40-415c-b500-8af973e59e8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1652.318698] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Releasing lock "refresh_cache-e7945a83-b063-42c4-9991-7f1e0545361d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1652.319034] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Instance network_info: |[{"id": "37e13eef-9f40-415c-b500-8af973e59e8e", "address": "fa:16:3e:36:cf:5c", "network": {"id": "434db171-07e6-464d-b0ee-79cea1ac09cc", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1758112308-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "885cd3dd811041428a917d8feadee3b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"667a2e97-c1be-421d-9941-6b84c2629b43", "external-id": "nsx-vlan-transportzone-484", "segmentation_id": 484, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap37e13eef-9f", "ovs_interfaceid": "37e13eef-9f40-415c-b500-8af973e59e8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1652.319379] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:36:cf:5c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '667a2e97-c1be-421d-9941-6b84c2629b43', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '37e13eef-9f40-415c-b500-8af973e59e8e', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1652.327215] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Creating folder: Project (885cd3dd811041428a917d8feadee3b2). Parent ref: group-v398749. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1652.328028] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5b3f810a-28dd-4701-9eb9-d9c37e182d26 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1652.338846] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Created folder: Project (885cd3dd811041428a917d8feadee3b2) in parent group-v398749. [ 1652.339080] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Creating folder: Instances. Parent ref: group-v398779. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1652.339341] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-33bd2d99-9657-4897-bb8e-e40046b872de {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1652.350061] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Created folder: Instances in parent group-v398779. [ 1652.350306] env[59577]: DEBUG oslo.service.loopingcall [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1652.350489] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1652.350687] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c8d7ddc2-c6c9-4223-84ae-59b86ee997c5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1652.378691] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1652.378691] env[59577]: value = "task-1933777" [ 1652.378691] env[59577]: _type = "Task" [ 1652.378691] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1652.389760] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933777, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1652.417038] env[59577]: DEBUG nova.compute.manager [req-a52169db-32c1-43ca-9df9-14356492288a req-80a4b7b0-8ea3-4bc6-ae4a-89ed680ae854 service nova] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Received event network-vif-plugged-37e13eef-9f40-415c-b500-8af973e59e8e {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1652.417038] env[59577]: DEBUG oslo_concurrency.lockutils [req-a52169db-32c1-43ca-9df9-14356492288a req-80a4b7b0-8ea3-4bc6-ae4a-89ed680ae854 service nova] Acquiring lock "e7945a83-b063-42c4-9991-7f1e0545361d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1652.417038] env[59577]: DEBUG oslo_concurrency.lockutils [req-a52169db-32c1-43ca-9df9-14356492288a req-80a4b7b0-8ea3-4bc6-ae4a-89ed680ae854 service nova] Lock 
"e7945a83-b063-42c4-9991-7f1e0545361d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1652.417038] env[59577]: DEBUG oslo_concurrency.lockutils [req-a52169db-32c1-43ca-9df9-14356492288a req-80a4b7b0-8ea3-4bc6-ae4a-89ed680ae854 service nova] Lock "e7945a83-b063-42c4-9991-7f1e0545361d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1652.417330] env[59577]: DEBUG nova.compute.manager [req-a52169db-32c1-43ca-9df9-14356492288a req-80a4b7b0-8ea3-4bc6-ae4a-89ed680ae854 service nova] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] No waiting events found dispatching network-vif-plugged-37e13eef-9f40-415c-b500-8af973e59e8e {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1652.417330] env[59577]: WARNING nova.compute.manager [req-a52169db-32c1-43ca-9df9-14356492288a req-80a4b7b0-8ea3-4bc6-ae4a-89ed680ae854 service nova] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Received unexpected event network-vif-plugged-37e13eef-9f40-415c-b500-8af973e59e8e for instance with vm_state building and task_state spawning. [ 1652.896821] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933777, 'name': CreateVM_Task, 'duration_secs': 0.416146} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1652.896821] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1652.896821] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1652.896821] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1652.896821] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1652.898252] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a42dddb9-f82b-44c1-aeab-2b6fd5ecbd10 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1652.902199] env[59577]: DEBUG oslo_vmware.api [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Waiting for the task: (returnval){ [ 
1652.902199] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]525c965f-5b85-50c2-f3d1-d5a5e797fbc1" [ 1652.902199] env[59577]: _type = "Task" [ 1652.902199] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1652.910510] env[59577]: DEBUG oslo_vmware.api [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]525c965f-5b85-50c2-f3d1-d5a5e797fbc1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1653.114526] env[59577]: DEBUG oslo_concurrency.lockutils [None req-e7736f56-1efe-4470-808a-6f14c5fdbc33 tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Acquiring lock "659b62e3-0bb2-46fb-aa6a-faef4883bdc1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1653.115321] env[59577]: DEBUG oslo_concurrency.lockutils [None req-e7736f56-1efe-4470-808a-6f14c5fdbc33 tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Lock "659b62e3-0bb2-46fb-aa6a-faef4883bdc1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1653.419017] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1653.422634] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1653.422941] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1654.618558] env[59577]: DEBUG nova.compute.manager [req-b216543f-8a69-4c37-8e63-b913ff23f8d7 req-a19c123c-9903-4121-8675-f97b36bd03de service nova] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Received event network-changed-37e13eef-9f40-415c-b500-8af973e59e8e {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1654.618818] env[59577]: DEBUG nova.compute.manager [req-b216543f-8a69-4c37-8e63-b913ff23f8d7 req-a19c123c-9903-4121-8675-f97b36bd03de service nova] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Refreshing instance network info cache due to event network-changed-37e13eef-9f40-415c-b500-8af973e59e8e. 
{{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1654.618992] env[59577]: DEBUG oslo_concurrency.lockutils [req-b216543f-8a69-4c37-8e63-b913ff23f8d7 req-a19c123c-9903-4121-8675-f97b36bd03de service nova] Acquiring lock "refresh_cache-e7945a83-b063-42c4-9991-7f1e0545361d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1654.619143] env[59577]: DEBUG oslo_concurrency.lockutils [req-b216543f-8a69-4c37-8e63-b913ff23f8d7 req-a19c123c-9903-4121-8675-f97b36bd03de service nova] Acquired lock "refresh_cache-e7945a83-b063-42c4-9991-7f1e0545361d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1654.619305] env[59577]: DEBUG nova.network.neutron [req-b216543f-8a69-4c37-8e63-b913ff23f8d7 req-a19c123c-9903-4121-8675-f97b36bd03de service nova] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Refreshing network info cache for port 37e13eef-9f40-415c-b500-8af973e59e8e {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1654.973252] env[59577]: DEBUG nova.network.neutron [req-b216543f-8a69-4c37-8e63-b913ff23f8d7 req-a19c123c-9903-4121-8675-f97b36bd03de service nova] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Updated VIF entry in instance network info cache for port 37e13eef-9f40-415c-b500-8af973e59e8e. 
{{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1654.973569] env[59577]: DEBUG nova.network.neutron [req-b216543f-8a69-4c37-8e63-b913ff23f8d7 req-a19c123c-9903-4121-8675-f97b36bd03de service nova] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Updating instance_info_cache with network_info: [{"id": "37e13eef-9f40-415c-b500-8af973e59e8e", "address": "fa:16:3e:36:cf:5c", "network": {"id": "434db171-07e6-464d-b0ee-79cea1ac09cc", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1758112308-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "885cd3dd811041428a917d8feadee3b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "667a2e97-c1be-421d-9941-6b84c2629b43", "external-id": "nsx-vlan-transportzone-484", "segmentation_id": 484, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap37e13eef-9f", "ovs_interfaceid": "37e13eef-9f40-415c-b500-8af973e59e8e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1654.984230] env[59577]: DEBUG oslo_concurrency.lockutils [req-b216543f-8a69-4c37-8e63-b913ff23f8d7 req-a19c123c-9903-4121-8675-f97b36bd03de service nova] Releasing lock "refresh_cache-e7945a83-b063-42c4-9991-7f1e0545361d" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1656.968592] env[59577]: DEBUG oslo_concurrency.lockutils [None 
req-5ce53f05-537c-4494-b7a4-5c6c39920506 tempest-ServerGroupTestJSON-834758818 tempest-ServerGroupTestJSON-834758818-project-member] Acquiring lock "bc569bae-c099-4437-8683-5af75ef5f106" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1656.968863] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5ce53f05-537c-4494-b7a4-5c6c39920506 tempest-ServerGroupTestJSON-834758818 tempest-ServerGroupTestJSON-834758818-project-member] Lock "bc569bae-c099-4437-8683-5af75ef5f106" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1662.012583] env[59577]: DEBUG oslo_concurrency.lockutils [None req-40841827-bd04-4c51-a0b7-417633b02639 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Acquiring lock "86651832-02db-4181-8a9e-11da5f017f65" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1662.012961] env[59577]: DEBUG oslo_concurrency.lockutils [None req-40841827-bd04-4c51-a0b7-417633b02639 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Lock "86651832-02db-4181-8a9e-11da5f017f65" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1664.045558] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1667.045213] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1667.045546] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1670.045596] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1670.045866] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1671.044752] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1672.039694] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1672.044248] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1672.073343] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1672.073575] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1672.073743] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1672.073899] env[59577]: DEBUG 
nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1672.075178] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-713695e8-f1e8-4f56-b4f4-0e6f1ad1e62a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1672.084202] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1277a587-07d3-4492-8595-58dee2ecd163 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1672.098243] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d26f9663-01ce-4331-82d4-b4e183e2c6c4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1672.104360] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3bb8a39-2999-4588-b2c2-8b4edfb96d35 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1672.133886] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181319MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1672.133886] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1672.134118] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1672.206572] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 3070aef7-432d-4f92-80d5-18efd8cceec3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1672.206743] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 6855f7a9-0dc0-41e1-900b-112181064d7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1672.206871] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 145736d0-0f14-4ec3-ada2-ddb1fe6f271a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1672.206993] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance cc3276aa-0d5a-4a14-ae90-e20a1b823bd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1672.207135] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance ee50624e-74d6-4afc-9fba-c541f1b83554 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1672.207254] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance e9b9f5db-afac-494e-9850-c0d82f26fc68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1672.207372] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 1a375c37-fcec-4442-827d-103352e81035 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1672.207488] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 47aac36c-3f70-40a8-ab60-cebba86d3f85 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1672.207600] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance b9d0daac-02e6-4862-b3de-64223d5a4a76 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1672.207751] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance e7945a83-b063-42c4-9991-7f1e0545361d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1672.250526] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 1ebb8847-1932-4ed6-8e56-bf48952cfc9c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.274364] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance b8002da2-eecd-490a-a34b-c651c28c57fc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.284615] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 077b8c8d-ee7e-495b-a7f7-676fe7c70f83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.294655] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 640c1048-dca1-4fbd-889b-cd8aa23eb3f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.304982] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance d25dbbad-94bb-4147-aa56-91aaab4ed077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.315512] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 7b8c78af-c9f9-4bff-a667-6faa0fb7c482 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.325549] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 16a36155-12ef-40b2-b94d-db4619ac2f4b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.337057] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance cefa22cd-9d7a-4b86-9ec5-bb9f005b42e0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.346948] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance eaa781b6-6542-495d-8430-73416444d972 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.356805] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 8ed18ae2-2ba1-424c-b695-846afd7b3501 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.366913] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 659b62e3-0bb2-46fb-aa6a-faef4883bdc1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.379512] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance bc569bae-c099-4437-8683-5af75ef5f106 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.389359] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 86651832-02db-4181-8a9e-11da5f017f65 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1672.389568] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1672.389716] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1672.671662] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b30fa4de-430b-4af4-ae32-8ef4dca67603 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1672.679870] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb2787d6-17eb-42a7-8490-44056244c90a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1672.710711] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acd290a4-e7f0-47f5-82e1-662d7bb1cf97 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1672.718253] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77f917f0-09fd-4725-b4f1-168038b7a456 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1672.731688] env[59577]: DEBUG nova.compute.provider_tree [None 
req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1672.741648] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1672.757509] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1672.757618] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.624s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1673.757758] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1676.046211] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task 
ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1676.046548] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1676.046548] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1676.065854] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1676.066052] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1676.066151] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1676.066279] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Skipping network cache update for instance because it is Building. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1676.066402] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1676.066524] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1676.066642] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 1a375c37-fcec-4442-827d-103352e81035] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1676.066759] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1676.066875] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1676.066989] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Skipping network cache update for instance because it is Building. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1676.067119] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1696.022698] env[59577]: WARNING oslo_vmware.rw_handles [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1696.022698] env[59577]: ERROR oslo_vmware.rw_handles [ 1696.022698] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 
tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/fe477266-7b32-4bef-bcbb-8a3aee556ead/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1696.024618] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1696.024840] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Copying Virtual Disk [datastore1] vmware_temp/fe477266-7b32-4bef-bcbb-8a3aee556ead/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] vmware_temp/fe477266-7b32-4bef-bcbb-8a3aee556ead/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1696.025126] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fcd5dda2-3c67-46bf-95eb-0291e6b4a6b9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1696.033126] env[59577]: DEBUG oslo_vmware.api [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Waiting for the task: (returnval){ [ 1696.033126] env[59577]: value = "task-1933778" [ 1696.033126] env[59577]: _type = "Task" [ 1696.033126] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1696.041172] env[59577]: DEBUG oslo_vmware.api [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Task: {'id': task-1933778, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1696.543466] env[59577]: DEBUG oslo_vmware.exceptions [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Fault InvalidArgument not matched. {{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1696.543750] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1696.544471] env[59577]: ERROR nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1696.544471] env[59577]: Faults: ['InvalidArgument'] [ 1696.544471] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Traceback (most recent call last): [ 1696.544471] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1696.544471] env[59577]: 
ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] yield resources [ 1696.544471] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1696.544471] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] self.driver.spawn(context, instance, image_meta, [ 1696.544471] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1696.544471] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1696.544471] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1696.544471] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] self._fetch_image_if_missing(context, vi) [ 1696.544471] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] image_cache(vi, tmp_image_ds_loc) [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] vm_util.copy_virtual_disk( [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1696.544821] env[59577]: ERROR nova.compute.manager 
[instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] session._wait_for_task(vmdk_copy_task) [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] return self.wait_for_task(task_ref) [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] return evt.wait() [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] result = hub.switch() [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1696.544821] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] return self.greenlet.switch() [ 1696.545160] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1696.545160] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] self.f(*self.args, **self.kw) [ 1696.545160] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1696.545160] env[59577]: ERROR nova.compute.manager [instance: 
3070aef7-432d-4f92-80d5-18efd8cceec3] raise exceptions.translate_fault(task_info.error) [ 1696.545160] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1696.545160] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Faults: ['InvalidArgument'] [ 1696.545160] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] [ 1696.545160] env[59577]: INFO nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Terminating instance [ 1696.546197] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1696.546385] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1696.546616] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e6fa06ae-8c93-4e76-9970-db96e2f3dd59 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1696.548989] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 
tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1696.549196] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1696.549896] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d10b724-f22a-4eb1-834e-6ae03e81acd9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1696.556283] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1696.556488] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e0c1d0b8-8d5d-4dcf-89db-e4e5132742a6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1696.575203] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1696.575393] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 
tempest-ServerDiskConfigTestJSON-1060147408-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1696.576118] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f41a1d6-a4f5-4540-8871-2fdeec8405cb {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1696.581292] env[59577]: DEBUG oslo_vmware.api [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Waiting for the task: (returnval){ [ 1696.581292] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5262d884-eb46-8932-4130-09feae33a226" [ 1696.581292] env[59577]: _type = "Task" [ 1696.581292] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1696.588698] env[59577]: DEBUG oslo_vmware.api [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5262d884-eb46-8932-4130-09feae33a226, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1696.652073] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1696.652073] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1696.652073] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Deleting the datastore file [datastore1] 3070aef7-432d-4f92-80d5-18efd8cceec3 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1696.652073] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2ec2d1cc-5d3c-4ab8-b1a9-44ecec261b28 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1696.658366] env[59577]: DEBUG oslo_vmware.api [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Waiting for the task: (returnval){ [ 1696.658366] env[59577]: value = "task-1933780" [ 1696.658366] env[59577]: _type = "Task" [ 1696.658366] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1696.666398] env[59577]: DEBUG oslo_vmware.api [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Task: {'id': task-1933780, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1697.091291] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1697.091573] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Creating directory with path [datastore1] vmware_temp/f0df05bb-9c8c-434d-826c-442cca14db65/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1697.091814] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-016a0a0e-660a-48e5-9e4a-db8c24f16fda {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1697.103480] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Created directory with path [datastore1] vmware_temp/f0df05bb-9c8c-434d-826c-442cca14db65/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1697.103699] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None 
req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Fetch image to [datastore1] vmware_temp/f0df05bb-9c8c-434d-826c-442cca14db65/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1697.103872] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/f0df05bb-9c8c-434d-826c-442cca14db65/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1697.104607] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5007dff-944d-4240-959e-2dd6e1d6bf5f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1697.111406] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdea1a04-0861-4b51-832d-22f80725e28c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1697.120248] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-794624af-a906-4318-b767-649f4b35e3c0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1697.151948] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c3f8faa-080d-4da3-8596-5dd6a7da7f44 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1697.159525] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e4e21f38-53b8-4116-b4f7-f6f2f12a8252 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1697.167343] env[59577]: DEBUG oslo_vmware.api [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Task: {'id': task-1933780, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074128} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1697.167575] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1697.167755] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1697.167927] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1697.168110] env[59577]: INFO nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 
3070aef7-432d-4f92-80d5-18efd8cceec3] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1697.170193] env[59577]: DEBUG nova.compute.claims [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1697.170359] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1697.170564] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1697.182258] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1697.240076] env[59577]: DEBUG oslo_vmware.rw_handles [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f0df05bb-9c8c-434d-826c-442cca14db65/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1697.298331] env[59577]: DEBUG oslo_vmware.rw_handles [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Completed reading data from the image iterator. {{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1697.298519] env[59577]: DEBUG oslo_vmware.rw_handles [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f0df05bb-9c8c-434d-826c-442cca14db65/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1697.501783] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-481ce118-39b7-406e-b982-a449970a383d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1697.509435] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73941683-0d2b-4c3b-8b75-6bae4af24f2b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1697.537842] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-032be727-ea04-43c7-b8eb-064856c1b8c0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1697.544528] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c099458-8660-4a30-b3c5-5332b9758bb4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1697.557800] env[59577]: DEBUG nova.compute.provider_tree [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1697.566263] env[59577]: DEBUG nova.scheduler.client.report [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1697.581472] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.411s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1697.582007] env[59577]: ERROR nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1697.582007] env[59577]: Faults: ['InvalidArgument'] [ 1697.582007] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Traceback (most recent call last): [ 1697.582007] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1697.582007] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] self.driver.spawn(context, instance, image_meta, [ 1697.582007] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1697.582007] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1697.582007] env[59577]: ERROR 
nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1697.582007] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] self._fetch_image_if_missing(context, vi) [ 1697.582007] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1697.582007] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] image_cache(vi, tmp_image_ds_loc) [ 1697.582007] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] vm_util.copy_virtual_disk( [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] session._wait_for_task(vmdk_copy_task) [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] return self.wait_for_task(task_ref) [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] return evt.wait() [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 
3070aef7-432d-4f92-80d5-18efd8cceec3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] result = hub.switch() [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] return self.greenlet.switch() [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1697.582363] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] self.f(*self.args, **self.kw) [ 1697.582727] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1697.582727] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] raise exceptions.translate_fault(task_info.error) [ 1697.582727] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1697.582727] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Faults: ['InvalidArgument'] [ 1697.582727] env[59577]: ERROR nova.compute.manager [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] [ 1697.582727] env[59577]: DEBUG nova.compute.utils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] VimFaultException {{(pid=59577) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 1697.583989] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Build of instance 3070aef7-432d-4f92-80d5-18efd8cceec3 was re-scheduled: A specified parameter was not correct: fileType [ 1697.583989] env[59577]: Faults: ['InvalidArgument'] {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1697.584361] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1697.584534] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1697.584711] env[59577]: DEBUG nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1697.584874] env[59577]: DEBUG nova.network.neutron [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1697.927309] env[59577]: DEBUG nova.network.neutron [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1697.943060] env[59577]: INFO nova.compute.manager [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] [instance: 3070aef7-432d-4f92-80d5-18efd8cceec3] Took 0.36 seconds to deallocate network for instance. 
[ 1698.045783] env[59577]: INFO nova.scheduler.client.report [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Deleted allocations for instance 3070aef7-432d-4f92-80d5-18efd8cceec3 [ 1698.066050] env[59577]: DEBUG oslo_concurrency.lockutils [None req-68584ba2-fc0c-4068-b0fb-3df791f0d62b tempest-ImagesNegativeTestJSON-1916757989 tempest-ImagesNegativeTestJSON-1916757989-project-member] Lock "3070aef7-432d-4f92-80d5-18efd8cceec3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 154.260s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1698.089229] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Starting instance... 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1698.138970] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1698.139259] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1698.140990] env[59577]: INFO nova.compute.claims [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1698.442785] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad126bd0-c111-4de3-bfe8-f0abff5695cf {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1698.450871] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2cbc302-7e21-497f-b0d5-a44fb9b2ee70 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1698.480597] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-887cff0e-f97d-4ce2-91c7-5d96fe44e14d {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1698.487654] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30752a4f-98e0-4aff-afdd-7951d68caf4f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1698.500404] env[59577]: DEBUG nova.compute.provider_tree [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1698.509113] env[59577]: DEBUG nova.scheduler.client.report [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1698.521406] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.382s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1698.521902] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 
tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1698.553449] env[59577]: DEBUG nova.compute.utils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1698.554977] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1698.555166] env[59577]: DEBUG nova.network.neutron [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1698.564493] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Start building block device mappings for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1698.614047] env[59577]: DEBUG nova.policy [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1206828d9f13442b9b86ff242850f4b3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76e8db4ae1a3491d8a0cc1b891807eb7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1698.630851] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1698.650823] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1698.651091] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1698.651408] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1698.651486] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Flavor pref 0:0:0 
{{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1698.651639] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1698.651797] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1698.652015] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1698.652180] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1698.652343] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1698.652505] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 
tempest-ServersNegativeTestJSON-870846203-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1698.652679] env[59577]: DEBUG nova.virt.hardware [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1698.653583] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87e1f214-0fec-4a79-ab96-9c5a36f94853 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1698.661284] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a258860e-542d-46b2-ae46-49ca75f993d1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1699.001952] env[59577]: DEBUG nova.network.neutron [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Successfully created port: 2096eda8-b035-40d8-bb8c-b2f2fa309777 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1699.795949] env[59577]: DEBUG nova.network.neutron [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Successfully updated port: 2096eda8-b035-40d8-bb8c-b2f2fa309777 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1699.808842] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 
tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Acquiring lock "refresh_cache-1ebb8847-1932-4ed6-8e56-bf48952cfc9c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1699.808996] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Acquired lock "refresh_cache-1ebb8847-1932-4ed6-8e56-bf48952cfc9c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1699.809157] env[59577]: DEBUG nova.network.neutron [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1699.844698] env[59577]: DEBUG nova.network.neutron [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Instance cache missing network info. 
{{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1700.060411] env[59577]: DEBUG nova.network.neutron [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Updating instance_info_cache with network_info: [{"id": "2096eda8-b035-40d8-bb8c-b2f2fa309777", "address": "fa:16:3e:c1:97:dc", "network": {"id": "ab9e3535-879e-4005-9c60-72caca83713b", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2113597816-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "76e8db4ae1a3491d8a0cc1b891807eb7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a071ecf4-e713-4f97-9271-8c17952f6dee", "external-id": "nsx-vlan-transportzone-23", "segmentation_id": 23, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2096eda8-b0", "ovs_interfaceid": "2096eda8-b035-40d8-bb8c-b2f2fa309777", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1700.073409] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Releasing lock "refresh_cache-1ebb8847-1932-4ed6-8e56-bf48952cfc9c" {{(pid=59577) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1700.073761] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Instance network_info: |[{"id": "2096eda8-b035-40d8-bb8c-b2f2fa309777", "address": "fa:16:3e:c1:97:dc", "network": {"id": "ab9e3535-879e-4005-9c60-72caca83713b", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2113597816-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "76e8db4ae1a3491d8a0cc1b891807eb7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a071ecf4-e713-4f97-9271-8c17952f6dee", "external-id": "nsx-vlan-transportzone-23", "segmentation_id": 23, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2096eda8-b0", "ovs_interfaceid": "2096eda8-b035-40d8-bb8c-b2f2fa309777", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1700.074155] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c1:97:dc', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': 'a071ecf4-e713-4f97-9271-8c17952f6dee', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2096eda8-b035-40d8-bb8c-b2f2fa309777', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1700.082408] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Creating folder: Project (76e8db4ae1a3491d8a0cc1b891807eb7). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1700.084742] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bf60f3be-60b1-4608-8f02-4ad87849a66f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1700.095158] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Created folder: Project (76e8db4ae1a3491d8a0cc1b891807eb7) in parent group-v398749. [ 1700.095350] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Creating folder: Instances. Parent ref: group-v398782. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1700.095562] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eb5874c8-faf8-4d57-b2ff-e7413456cdf0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1700.104256] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Created folder: Instances in parent group-v398782. [ 1700.104430] env[59577]: DEBUG oslo.service.loopingcall [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1700.104591] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1700.104801] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c72648f3-4883-476f-9278-7703e1871684 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1700.125695] env[59577]: DEBUG nova.compute.manager [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Received event network-vif-plugged-2096eda8-b035-40d8-bb8c-b2f2fa309777 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1700.126044] env[59577]: DEBUG oslo_concurrency.lockutils [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] Acquiring lock "1ebb8847-1932-4ed6-8e56-bf48952cfc9c-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1700.126254] env[59577]: DEBUG oslo_concurrency.lockutils [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] Lock "1ebb8847-1932-4ed6-8e56-bf48952cfc9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1700.126469] env[59577]: DEBUG oslo_concurrency.lockutils [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] Lock "1ebb8847-1932-4ed6-8e56-bf48952cfc9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1700.126682] env[59577]: DEBUG nova.compute.manager [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] No waiting events found dispatching network-vif-plugged-2096eda8-b035-40d8-bb8c-b2f2fa309777 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1700.126905] env[59577]: WARNING nova.compute.manager [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Received unexpected event network-vif-plugged-2096eda8-b035-40d8-bb8c-b2f2fa309777 for instance with vm_state building and task_state spawning. 
[ 1700.127124] env[59577]: DEBUG nova.compute.manager [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Received event network-changed-2096eda8-b035-40d8-bb8c-b2f2fa309777 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1700.127328] env[59577]: DEBUG nova.compute.manager [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Refreshing instance network info cache due to event network-changed-2096eda8-b035-40d8-bb8c-b2f2fa309777. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1700.127554] env[59577]: DEBUG oslo_concurrency.lockutils [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] Acquiring lock "refresh_cache-1ebb8847-1932-4ed6-8e56-bf48952cfc9c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1700.127737] env[59577]: DEBUG oslo_concurrency.lockutils [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] Acquired lock "refresh_cache-1ebb8847-1932-4ed6-8e56-bf48952cfc9c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1700.127943] env[59577]: DEBUG nova.network.neutron [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Refreshing network info cache for port 2096eda8-b035-40d8-bb8c-b2f2fa309777 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1700.139175] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1700.139175] env[59577]: value = "task-1933783" [ 1700.139175] env[59577]: _type = "Task" [ 1700.139175] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1700.148313] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933783, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1700.436176] env[59577]: DEBUG nova.network.neutron [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Updated VIF entry in instance network info cache for port 2096eda8-b035-40d8-bb8c-b2f2fa309777. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1700.436176] env[59577]: DEBUG nova.network.neutron [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Updating instance_info_cache with network_info: [{"id": "2096eda8-b035-40d8-bb8c-b2f2fa309777", "address": "fa:16:3e:c1:97:dc", "network": {"id": "ab9e3535-879e-4005-9c60-72caca83713b", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2113597816-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "76e8db4ae1a3491d8a0cc1b891807eb7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a071ecf4-e713-4f97-9271-8c17952f6dee", "external-id": "nsx-vlan-transportzone-23", "segmentation_id": 23, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2096eda8-b0", "ovs_interfaceid": "2096eda8-b035-40d8-bb8c-b2f2fa309777", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1700.450886] env[59577]: DEBUG oslo_concurrency.lockutils [req-899a2548-28b6-45e4-ac6d-93074a3b7d2c req-10c329bc-1133-4eb6-8cca-a785a5abdcd2 service nova] Releasing lock "refresh_cache-1ebb8847-1932-4ed6-8e56-bf48952cfc9c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1700.650433] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933783, 'name': CreateVM_Task, 'duration_secs': 0.330067} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1700.650610] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1700.651283] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1700.651441] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1700.651739] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] 
Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1700.651988] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-baf4402b-f8f0-4e74-a49e-39daec9d6a2a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1700.656927] env[59577]: DEBUG oslo_vmware.api [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Waiting for the task: (returnval){ [ 1700.656927] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5279c363-29f9-66b2-8e5d-0267ac3ac122" [ 1700.656927] env[59577]: _type = "Task" [ 1700.656927] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1700.667793] env[59577]: DEBUG oslo_vmware.api [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5279c363-29f9-66b2-8e5d-0267ac3ac122, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1701.167170] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1701.167447] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1701.167991] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1726.045556] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1727.045705] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1728.045439] env[59577]: DEBUG oslo_service.periodic_task [None 
req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1732.046540] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1732.046864] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1733.039511] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1733.044389] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1734.045411] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1734.055643] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1734.055876] env[59577]: DEBUG 
oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1734.056058] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1734.056222] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1734.057658] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d56dd69c-0c74-49b2-85cb-278f84d104fd {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1734.066336] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3752b89-a9d5-441e-99fe-ad7f669456ec {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1734.080012] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e17bdf71-3cc5-440d-b873-5d88ce2804c4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1734.087255] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e1c858c-dc46-41ba-8382-582a9b6a8f83 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1734.118439] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181292MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1734.118608] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1734.118787] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1734.186115] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 6855f7a9-0dc0-41e1-900b-112181064d7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1734.186289] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 145736d0-0f14-4ec3-ada2-ddb1fe6f271a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1734.186420] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance cc3276aa-0d5a-4a14-ae90-e20a1b823bd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1734.186582] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance ee50624e-74d6-4afc-9fba-c541f1b83554 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1734.186686] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance e9b9f5db-afac-494e-9850-c0d82f26fc68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1734.186801] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 1a375c37-fcec-4442-827d-103352e81035 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1734.186917] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 47aac36c-3f70-40a8-ab60-cebba86d3f85 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1734.187042] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance b9d0daac-02e6-4862-b3de-64223d5a4a76 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1734.187161] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance e7945a83-b063-42c4-9991-7f1e0545361d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1734.187289] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 1ebb8847-1932-4ed6-8e56-bf48952cfc9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1734.197914] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance b8002da2-eecd-490a-a34b-c651c28c57fc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1734.208105] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 077b8c8d-ee7e-495b-a7f7-676fe7c70f83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1734.217660] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 640c1048-dca1-4fbd-889b-cd8aa23eb3f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1734.227691] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance d25dbbad-94bb-4147-aa56-91aaab4ed077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1734.237085] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 7b8c78af-c9f9-4bff-a667-6faa0fb7c482 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1734.251228] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 16a36155-12ef-40b2-b94d-db4619ac2f4b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1734.261958] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance cefa22cd-9d7a-4b86-9ec5-bb9f005b42e0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1734.271874] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance eaa781b6-6542-495d-8430-73416444d972 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1734.282882] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 8ed18ae2-2ba1-424c-b695-846afd7b3501 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1734.291973] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 659b62e3-0bb2-46fb-aa6a-faef4883bdc1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1734.301513] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance bc569bae-c099-4437-8683-5af75ef5f106 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1734.310963] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 86651832-02db-4181-8a9e-11da5f017f65 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1734.311205] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1734.311365] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1734.552236] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4da62b25-983c-4edb-928a-c63e8d865dc9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1734.559928] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0603df1-49c7-4e83-b237-87590d969a76 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1734.589487] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fa35ace-5e1b-42ed-bd9e-936088aef79f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1734.596045] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35eb369f-142a-4dca-a076-6eaec193e5cd {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1734.608635] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1734.617765] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1734.630071] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1734.630260] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.511s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1735.630555] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1736.045133] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1736.045325] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1736.045445] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1736.064825] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1736.065030] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1736.065114] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1736.065236] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1736.065399] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1736.065516] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 1a375c37-fcec-4442-827d-103352e81035] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1736.065640] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1736.065757] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1736.065874] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1736.065991] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1736.066122] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1741.062146] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1746.039746] env[59577]: WARNING oslo_vmware.rw_handles [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1746.039746] env[59577]: ERROR oslo_vmware.rw_handles
[ 1746.040389] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/f0df05bb-9c8c-434d-826c-442cca14db65/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1746.041991] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1746.042289] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Copying Virtual Disk [datastore1] vmware_temp/f0df05bb-9c8c-434d-826c-442cca14db65/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] vmware_temp/f0df05bb-9c8c-434d-826c-442cca14db65/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1746.042605] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-19dee520-b294-4623-80f1-473f3b1a2ad1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1746.051376] env[59577]: DEBUG oslo_vmware.api [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Waiting for the task: (returnval){
[ 1746.051376] env[59577]:     value = "task-1933784"
[ 1746.051376] env[59577]:     _type = "Task"
[ 1746.051376] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1746.060655] env[59577]: DEBUG oslo_vmware.api [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Task: {'id': task-1933784, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1746.562009] env[59577]: DEBUG oslo_vmware.exceptions [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Fault InvalidArgument not matched. {{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 1746.562280] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1746.562851] env[59577]: ERROR nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1746.562851] env[59577]: Faults: ['InvalidArgument']
[ 1746.562851] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Traceback (most recent call last):
[ 1746.562851] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1746.562851] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     yield resources
[ 1746.562851] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1746.562851] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     self.driver.spawn(context, instance, image_meta,
[ 1746.562851] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1746.562851] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1746.562851] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1746.562851] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     self._fetch_image_if_missing(context, vi)
[ 1746.562851] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     image_cache(vi, tmp_image_ds_loc)
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     vm_util.copy_virtual_disk(
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     session._wait_for_task(vmdk_copy_task)
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     return self.wait_for_task(task_ref)
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     return evt.wait()
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     result = hub.switch()
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1746.563279] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     return self.greenlet.switch()
[ 1746.563681] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1746.563681] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     self.f(*self.args, **self.kw)
[ 1746.563681] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1746.563681] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]     raise exceptions.translate_fault(task_info.error)
[ 1746.563681] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1746.563681] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Faults: ['InvalidArgument']
[ 1746.563681] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d]
[ 1746.563681] env[59577]: INFO nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Terminating instance
[ 1746.564888] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1746.565244] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1746.565479] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ad0c8103-1994-4877-a700-e060a0aad1e3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1746.568478] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1746.568730] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1746.569750] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76e5a9a3-1f46-41f3-843a-0b14940d24fc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1746.576795] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1746.577033] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-58e6564c-2c9b-49f5-9f5b-5617adbd0fae {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1746.579139] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1746.579306] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1746.580245] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-297b7049-5ea0-4a74-a8cf-94549c7fdcc0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1746.584683] env[59577]: DEBUG oslo_vmware.api [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Waiting for the task: (returnval){
[ 1746.584683] env[59577]:     value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52b2420c-8217-8684-ec7c-8ecfc51f5ff3"
[ 1746.584683] env[59577]:     _type = "Task"
[ 1746.584683] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1746.591716] env[59577]: DEBUG oslo_vmware.api [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52b2420c-8217-8684-ec7c-8ecfc51f5ff3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1746.666376] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1746.666557] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1746.666788] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Deleting the datastore file [datastore1] 6855f7a9-0dc0-41e1-900b-112181064d7d {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1746.667105] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0d2f1029-2aab-443c-b63f-1d4c279ab98a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1746.672869] env[59577]: DEBUG oslo_vmware.api [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Waiting for the task: (returnval){
[ 1746.672869] env[59577]:     value = "task-1933786"
[ 1746.672869] env[59577]:     _type = "Task"
[ 1746.672869] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1746.681708] env[59577]: DEBUG oslo_vmware.api [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Task: {'id': task-1933786, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1747.096579] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1747.096579] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Creating directory with path [datastore1] vmware_temp/8a735491-591b-489c-b943-f1d15e5f4f15/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1747.096579] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-18b31cba-3ebc-4f14-8210-41f89db8b168 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1747.106860] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Created directory with path [datastore1] vmware_temp/8a735491-591b-489c-b943-f1d15e5f4f15/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1747.107060] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Fetch image to [datastore1] vmware_temp/8a735491-591b-489c-b943-f1d15e5f4f15/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1747.107231] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/8a735491-591b-489c-b943-f1d15e5f4f15/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1747.107930] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7bafc76-32ac-46ab-a638-c5352addfe00 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1747.114425] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9d05d13-31c1-4877-aeb7-efa2e644896a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1747.123308] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f0d0f40-afa4-4777-b167-614a27a6628c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1747.153739] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50d2d12c-1692-435d-94a3-5076c82e10d0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1747.158926] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7f945b65-ea1f-4da0-bcaa-5d9799402e1b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1747.181682] env[59577]: DEBUG oslo_vmware.api [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Task: {'id': task-1933786, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078655} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1747.181914] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1747.182104] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1747.182277] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1747.182603] env[59577]: INFO nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 1747.184847] env[59577]: DEBUG nova.compute.claims [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1747.185024] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1747.185241] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1747.248390] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1747.302372] env[59577]: DEBUG oslo_vmware.rw_handles [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8a735491-591b-489c-b943-f1d15e5f4f15/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1747.360757] env[59577]: DEBUG oslo_vmware.rw_handles [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Completed reading data from the image iterator. {{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1747.360942] env[59577]: DEBUG oslo_vmware.rw_handles [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8a735491-591b-489c-b943-f1d15e5f4f15/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1747.515160] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c582764a-bcd5-4864-a443-29b44c7df0f7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1747.522917] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7241354e-2ee5-4066-b640-aaf4fe75c0d7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1747.552754] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6218bf8f-1d17-4bb8-8a8b-b2f55bd6072a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1747.560203] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81209d6b-a042-44ed-b71e-fae8c0d06d09 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1747.573293] env[59577]: DEBUG nova.compute.provider_tree [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1747.581387] env[59577]: DEBUG nova.scheduler.client.report [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved':
512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1747.594052] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.409s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1747.594601] env[59577]: ERROR nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1747.594601] env[59577]: Faults: ['InvalidArgument'] [ 1747.594601] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Traceback (most recent call last): [ 1747.594601] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1747.594601] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] self.driver.spawn(context, instance, image_meta, [ 1747.594601] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1747.594601] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1747.594601] 
env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1747.594601] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] self._fetch_image_if_missing(context, vi) [ 1747.594601] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1747.594601] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] image_cache(vi, tmp_image_ds_loc) [ 1747.594601] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] vm_util.copy_virtual_disk( [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] session._wait_for_task(vmdk_copy_task) [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] return self.wait_for_task(task_ref) [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] return evt.wait() [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 
6855f7a9-0dc0-41e1-900b-112181064d7d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] result = hub.switch() [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] return self.greenlet.switch() [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1747.594964] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] self.f(*self.args, **self.kw) [ 1747.595342] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1747.595342] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] raise exceptions.translate_fault(task_info.error) [ 1747.595342] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1747.595342] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Faults: ['InvalidArgument'] [ 1747.595342] env[59577]: ERROR nova.compute.manager [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] [ 1747.595342] env[59577]: DEBUG nova.compute.utils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] VimFaultException {{(pid=59577) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 1747.596585] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Build of instance 6855f7a9-0dc0-41e1-900b-112181064d7d was re-scheduled: A specified parameter was not correct: fileType [ 1747.596585] env[59577]: Faults: ['InvalidArgument'] {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1747.596947] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1747.597126] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1747.597291] env[59577]: DEBUG nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1747.597449] env[59577]: DEBUG nova.network.neutron [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1747.920219] env[59577]: DEBUG nova.network.neutron [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1747.935082] env[59577]: INFO nova.compute.manager [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: 6855f7a9-0dc0-41e1-900b-112181064d7d] Took 0.34 seconds to deallocate network for instance. 
[ 1748.027437] env[59577]: INFO nova.scheduler.client.report [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Deleted allocations for instance 6855f7a9-0dc0-41e1-900b-112181064d7d [ 1748.044755] env[59577]: DEBUG oslo_concurrency.lockutils [None req-aa6a649a-b93c-410b-99f4-825decd3a867 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock "6855f7a9-0dc0-41e1-900b-112181064d7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 137.488s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1748.067846] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Starting instance... 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1748.123999] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1748.124266] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1748.125814] env[59577]: INFO nova.compute.claims [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1748.428221] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e61ad090-62ab-427d-83db-aa4a0ee2cda7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1748.435611] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-132aca71-45f0-40fb-b12e-a6f693ff6dfe {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1748.465883] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97020154-d232-4da7-baaa-6a90f0d8cf3d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1748.472692] 
env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a672486-f225-4bfe-9953-86567737f0f6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1748.485412] env[59577]: DEBUG nova.compute.provider_tree [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1748.494904] env[59577]: DEBUG nova.scheduler.client.report [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1748.506335] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.382s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1748.506791] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] 
Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1748.537964] env[59577]: DEBUG nova.compute.utils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1748.539148] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1748.539320] env[59577]: DEBUG nova.network.neutron [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1748.547272] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Start building block device mappings for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1748.607138] env[59577]: DEBUG nova.policy [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17f8660c9c09492bba48e650519012cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a8bdbc34699435bbde5622db4df613f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1748.610599] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1748.634032] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1748.634286] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1748.634440] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1748.634649] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 1748.634803] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1748.634950] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1748.635231] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1748.635399] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1748.638401] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1748.638607] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1748.638794] env[59577]: DEBUG nova.virt.hardware [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1748.639645] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-560ccfe6-3caa-4550-99c1-5e58c8dc3f69 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1748.647570] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aeadfe62-83ab-4e9a-8ea5-47dcff6a2d93 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1749.037769] env[59577]: DEBUG nova.network.neutron [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Successfully created port: 441a0cf4-0c88-4060-9011-1de3673f29ba {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1749.069779] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "aac8eec6-577b-46d2-9baa-8cf548a6970e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1749.069779] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] 
Lock "aac8eec6-577b-46d2-9baa-8cf548a6970e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1749.876289] env[59577]: DEBUG nova.network.neutron [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Successfully updated port: 441a0cf4-0c88-4060-9011-1de3673f29ba {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1749.893248] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "refresh_cache-b8002da2-eecd-490a-a34b-c651c28c57fc" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1749.893248] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquired lock "refresh_cache-b8002da2-eecd-490a-a34b-c651c28c57fc" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1749.893248] env[59577]: DEBUG nova.network.neutron [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1749.945249] env[59577]: DEBUG nova.network.neutron [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Instance cache missing network info. 
{{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1750.055806] env[59577]: DEBUG nova.compute.manager [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Received event network-vif-plugged-441a0cf4-0c88-4060-9011-1de3673f29ba {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1750.056031] env[59577]: DEBUG oslo_concurrency.lockutils [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] Acquiring lock "b8002da2-eecd-490a-a34b-c651c28c57fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1750.056675] env[59577]: DEBUG oslo_concurrency.lockutils [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] Lock "b8002da2-eecd-490a-a34b-c651c28c57fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1750.056675] env[59577]: DEBUG oslo_concurrency.lockutils [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] Lock "b8002da2-eecd-490a-a34b-c651c28c57fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1750.056675] env[59577]: DEBUG nova.compute.manager [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] No waiting events found dispatching network-vif-plugged-441a0cf4-0c88-4060-9011-1de3673f29ba {{(pid=59577) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1750.056675] env[59577]: WARNING nova.compute.manager [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Received unexpected event network-vif-plugged-441a0cf4-0c88-4060-9011-1de3673f29ba for instance with vm_state building and task_state spawning. [ 1750.056875] env[59577]: DEBUG nova.compute.manager [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Received event network-changed-441a0cf4-0c88-4060-9011-1de3673f29ba {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1750.056966] env[59577]: DEBUG nova.compute.manager [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Refreshing instance network info cache due to event network-changed-441a0cf4-0c88-4060-9011-1de3673f29ba. 
{{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1750.057422] env[59577]: DEBUG oslo_concurrency.lockutils [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] Acquiring lock "refresh_cache-b8002da2-eecd-490a-a34b-c651c28c57fc" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1750.174387] env[59577]: DEBUG nova.network.neutron [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Updating instance_info_cache with network_info: [{"id": "441a0cf4-0c88-4060-9011-1de3673f29ba", "address": "fa:16:3e:cf:b0:42", "network": {"id": "a61ea66b-b7b2-4c86-976d-641129187c28", "bridge": "br-int", "label": "tempest-ServersTestJSON-427195933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8a8bdbc34699435bbde5622db4df613f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6a6f7bb-6106-4cfd-9aef-b85628d0cefa", "external-id": "nsx-vlan-transportzone-194", "segmentation_id": 194, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap441a0cf4-0c", "ovs_interfaceid": "441a0cf4-0c88-4060-9011-1de3673f29ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1750.184993] env[59577]: DEBUG 
oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Releasing lock "refresh_cache-b8002da2-eecd-490a-a34b-c651c28c57fc" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1750.185298] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Instance network_info: |[{"id": "441a0cf4-0c88-4060-9011-1de3673f29ba", "address": "fa:16:3e:cf:b0:42", "network": {"id": "a61ea66b-b7b2-4c86-976d-641129187c28", "bridge": "br-int", "label": "tempest-ServersTestJSON-427195933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8a8bdbc34699435bbde5622db4df613f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6a6f7bb-6106-4cfd-9aef-b85628d0cefa", "external-id": "nsx-vlan-transportzone-194", "segmentation_id": 194, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap441a0cf4-0c", "ovs_interfaceid": "441a0cf4-0c88-4060-9011-1de3673f29ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1750.185922] env[59577]: DEBUG oslo_concurrency.lockutils [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] 
Acquired lock "refresh_cache-b8002da2-eecd-490a-a34b-c651c28c57fc" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1750.185922] env[59577]: DEBUG nova.network.neutron [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Refreshing network info cache for port 441a0cf4-0c88-4060-9011-1de3673f29ba {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1750.186744] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cf:b0:42', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd6a6f7bb-6106-4cfd-9aef-b85628d0cefa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '441a0cf4-0c88-4060-9011-1de3673f29ba', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1750.194063] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Creating folder: Project (8a8bdbc34699435bbde5622db4df613f). Parent ref: group-v398749. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1750.195429] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dc69486d-631a-45cb-9c5e-73ba461436c1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1750.208161] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Created folder: Project (8a8bdbc34699435bbde5622db4df613f) in parent group-v398749. [ 1750.208346] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Creating folder: Instances. Parent ref: group-v398785. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1750.208560] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6d7f67d9-26cd-4201-8915-5fd35c80333c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1750.217121] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Created folder: Instances in parent group-v398785. [ 1750.217338] env[59577]: DEBUG oslo.service.loopingcall [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1750.217507] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1750.217686] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-20a334e9-730e-427c-924f-e3a56915db22 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1750.235184] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1750.235184] env[59577]: value = "task-1933789" [ 1750.235184] env[59577]: _type = "Task" [ 1750.235184] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1750.245184] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933789, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1750.523818] env[59577]: DEBUG nova.network.neutron [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Updated VIF entry in instance network info cache for port 441a0cf4-0c88-4060-9011-1de3673f29ba. 
{{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1750.524436] env[59577]: DEBUG nova.network.neutron [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Updating instance_info_cache with network_info: [{"id": "441a0cf4-0c88-4060-9011-1de3673f29ba", "address": "fa:16:3e:cf:b0:42", "network": {"id": "a61ea66b-b7b2-4c86-976d-641129187c28", "bridge": "br-int", "label": "tempest-ServersTestJSON-427195933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8a8bdbc34699435bbde5622db4df613f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6a6f7bb-6106-4cfd-9aef-b85628d0cefa", "external-id": "nsx-vlan-transportzone-194", "segmentation_id": 194, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap441a0cf4-0c", "ovs_interfaceid": "441a0cf4-0c88-4060-9011-1de3673f29ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1750.533549] env[59577]: DEBUG oslo_concurrency.lockutils [req-759305d7-a8f9-40ae-9116-78903b33ff23 req-1dc1512f-ac8b-42bf-844d-5bb75b9982de service nova] Releasing lock "refresh_cache-b8002da2-eecd-490a-a34b-c651c28c57fc" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1750.748442] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': 
task-1933789, 'name': CreateVM_Task, 'duration_secs': 0.34248} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1750.748442] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1750.749098] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1750.749268] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1750.749583] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1750.749879] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cf87ca37-3ffa-418b-abd7-4af89d366e5e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1750.754446] env[59577]: DEBUG oslo_vmware.api [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] 
Waiting for the task: (returnval){ [ 1750.754446] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52df8577-291d-f49c-01eb-9c0081f62d47" [ 1750.754446] env[59577]: _type = "Task" [ 1750.754446] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1750.765825] env[59577]: DEBUG oslo_vmware.api [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52df8577-291d-f49c-01eb-9c0081f62d47, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1751.264466] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1751.264800] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1751.265036] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1786.045272] env[59577]: DEBUG oslo_service.periodic_task [None 
req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1787.045728] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1788.045267] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1792.045651] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1792.045939] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1793.040509] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1793.081709] env[59577]: WARNING oslo_vmware.rw_handles [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1793.081709] env[59577]: ERROR oslo_vmware.rw_handles [ 1793.082481] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 
tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/8a735491-591b-489c-b943-f1d15e5f4f15/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1793.084298] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1793.084552] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Copying Virtual Disk [datastore1] vmware_temp/8a735491-591b-489c-b943-f1d15e5f4f15/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] vmware_temp/8a735491-591b-489c-b943-f1d15e5f4f15/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1793.084824] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1d6691e5-eb18-4cbc-a7a9-4acf01d5b8c9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1793.092996] env[59577]: DEBUG oslo_vmware.api [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Waiting for the task: (returnval){ [ 1793.092996] env[59577]: value = "task-1933790" [ 1793.092996] env[59577]: _type = "Task" [ 1793.092996] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1793.103034] env[59577]: DEBUG oslo_vmware.api [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Task: {'id': task-1933790, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1793.603779] env[59577]: DEBUG oslo_vmware.exceptions [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Fault InvalidArgument not matched. {{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1793.604035] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1793.604592] env[59577]: ERROR nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1793.604592] env[59577]: Faults: ['InvalidArgument'] [ 1793.604592] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Traceback (most recent call last): [ 1793.604592] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1793.604592] env[59577]: ERROR nova.compute.manager 
[instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] yield resources [ 1793.604592] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1793.604592] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] self.driver.spawn(context, instance, image_meta, [ 1793.604592] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1793.604592] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1793.604592] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1793.604592] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] self._fetch_image_if_missing(context, vi) [ 1793.604592] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] image_cache(vi, tmp_image_ds_loc) [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] vm_util.copy_virtual_disk( [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 
145736d0-0f14-4ec3-ada2-ddb1fe6f271a] session._wait_for_task(vmdk_copy_task) [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] return self.wait_for_task(task_ref) [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] return evt.wait() [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] result = hub.switch() [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1793.605022] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] return self.greenlet.switch() [ 1793.605487] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1793.605487] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] self.f(*self.args, **self.kw) [ 1793.605487] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1793.605487] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] raise 
exceptions.translate_fault(task_info.error) [ 1793.605487] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1793.605487] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Faults: ['InvalidArgument'] [ 1793.605487] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] [ 1793.605487] env[59577]: INFO nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Terminating instance [ 1793.606508] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1793.606729] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1793.606966] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3310e49c-2ccc-4393-8d42-e995c2e3befc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1793.609153] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 
145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1793.609346] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1793.610074] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4421b00c-3379-4477-8e7f-b41f51e4bfb2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1793.617259] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1793.617477] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-21a7da12-518e-4031-8b74-9bd80b45cfdc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1793.619715] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1793.619889] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1793.620817] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5ca9b8bb-623b-4755-9fd6-4bb35446b726 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1793.625431] env[59577]: DEBUG oslo_vmware.api [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Waiting for the task: (returnval){ [ 1793.625431] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]522b7611-aa22-d9e5-7f98-23a5db6fc911" [ 1793.625431] env[59577]: _type = "Task" [ 1793.625431] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1793.632543] env[59577]: DEBUG oslo_vmware.api [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]522b7611-aa22-d9e5-7f98-23a5db6fc911, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1793.699626] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1793.699853] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1793.700051] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Deleting the datastore file [datastore1] 145736d0-0f14-4ec3-ada2-ddb1fe6f271a {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1793.700330] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8097fb60-1d7d-4405-8aa6-7202441cca18 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1793.706455] env[59577]: DEBUG oslo_vmware.api [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Waiting for the task: (returnval){
[ 1793.706455] env[59577]: value = "task-1933792"
[ 1793.706455] env[59577]: _type = "Task"
[ 1793.706455] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1793.713798] env[59577]: DEBUG oslo_vmware.api [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Task: {'id': task-1933792, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1794.044175] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1794.044422] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1794.054530] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1794.054743] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1794.054904] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1794.055071] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1794.056281] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-055ca7c4-a6a6-4d44-8048-b54e4ad357c2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.064634] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8521c3c-0f11-4559-844d-5d91d8b1d563 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.078436] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dc6fb64-1b60-4e5d-9fc1-8aed7d3413f5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.084902] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd6f4555-197e-4ac5-ad1b-b4a04fe09007 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.115923] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181319MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1794.116088] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1794.116283] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1794.134768] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1794.135016] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Creating directory with path [datastore1] vmware_temp/72d2580a-2330-46ab-ba4c-8935b06f1651/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1794.135236] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-077b47d0-8f19-4c65-9ac1-a6a029e2aa28 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.146548] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Created directory with path [datastore1] vmware_temp/72d2580a-2330-46ab-ba4c-8935b06f1651/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1794.146747] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Fetch image to [datastore1] vmware_temp/72d2580a-2330-46ab-ba4c-8935b06f1651/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1794.146912] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/72d2580a-2330-46ab-ba4c-8935b06f1651/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1794.147634] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-100c1288-272e-4344-b824-a838ae4f1458 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.161551] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53a19ebe-6686-49e1-9d8a-2fcc0e3d28a1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.173317] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-112bad53-9620-4cf0-b937-744a72a80189 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.204368] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 145736d0-0f14-4ec3-ada2-ddb1fe6f271a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1794.204525] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance cc3276aa-0d5a-4a14-ae90-e20a1b823bd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1794.204653] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance ee50624e-74d6-4afc-9fba-c541f1b83554 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1794.204776] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance e9b9f5db-afac-494e-9850-c0d82f26fc68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1794.204896] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 1a375c37-fcec-4442-827d-103352e81035 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1794.205026] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 47aac36c-3f70-40a8-ab60-cebba86d3f85 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1794.205149] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance b9d0daac-02e6-4862-b3de-64223d5a4a76 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1794.205264] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance e7945a83-b063-42c4-9991-7f1e0545361d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1794.205379] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 1ebb8847-1932-4ed6-8e56-bf48952cfc9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1794.205534] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance b8002da2-eecd-490a-a34b-c651c28c57fc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1794.207190] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11de039e-c462-46ab-93f6-24be06f2cac5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.219743] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2c5933a2-2aab-466b-ab71-e584b683b8fa {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.221750] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 077b8c8d-ee7e-495b-a7f7-676fe7c70f83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.223026] env[59577]: DEBUG oslo_vmware.api [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Task: {'id': task-1933792, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069528} completed successfully.
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1794.223420] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1794.223625] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1794.223802] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1794.223974] env[59577]: INFO nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 1794.226320] env[59577]: DEBUG nova.compute.claims [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1794.226496] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1794.232186] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 640c1048-dca1-4fbd-889b-cd8aa23eb3f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.242156] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance d25dbbad-94bb-4147-aa56-91aaab4ed077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.252445] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 7b8c78af-c9f9-4bff-a667-6faa0fb7c482 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.262761] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 16a36155-12ef-40b2-b94d-db4619ac2f4b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.276656] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance cefa22cd-9d7a-4b86-9ec5-bb9f005b42e0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.287008] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance eaa781b6-6542-495d-8430-73416444d972 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.296512] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 8ed18ae2-2ba1-424c-b695-846afd7b3501 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.306673] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 659b62e3-0bb2-46fb-aa6a-faef4883bdc1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.311471] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1794.318724] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance bc569bae-c099-4437-8683-5af75ef5f106 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.328828] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 86651832-02db-4181-8a9e-11da5f017f65 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.338895] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance aac8eec6-577b-46d2-9baa-8cf548a6970e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 1794.339140] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1794.339290] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1794.363491] env[59577]: DEBUG oslo_vmware.rw_handles [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72d2580a-2330-46ab-ba4c-8935b06f1651/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1794.420359] env[59577]: DEBUG oslo_vmware.rw_handles [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Completed reading data from the image iterator.
{{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1794.420543] env[59577]: DEBUG oslo_vmware.rw_handles [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72d2580a-2330-46ab-ba4c-8935b06f1651/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1794.644685] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26e61486-1426-48db-9cd9-8b66ea2c2c66 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.652206] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa5b2e9c-0c55-4a57-8677-1e38292bb5be {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.680976] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c23e458a-a232-4d43-87f0-de0d8397201e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.687676] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c82ed60-f92c-4df7-b55d-b19fe7845750 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1794.699977] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1794.708456] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1794.723206] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1794.723206] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.605s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1794.723206] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.495s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1794.998797] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d90eaca-7140-4d0b-a4e4-54b303cddfc8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1795.006238] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c22a2769-aeb0-4f61-b861-a526fc663584 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1795.035706] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06e1212b-578e-4c7b-a5a4-91a4d9766dc6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1795.042586] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e19e412-c9b0-437b-89b3-978cf633874f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1795.054955] env[59577]: DEBUG nova.compute.provider_tree [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1795.063236] env[59577]: DEBUG nova.scheduler.client.report [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1795.078247] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.356s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1795.078767] env[59577]: ERROR nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1795.078767] env[59577]: Faults: ['InvalidArgument']
[ 1795.078767] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Traceback (most recent call last):
[ 1795.078767] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1795.078767] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] self.driver.spawn(context, instance, image_meta,
[ 1795.078767] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1795.078767] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1795.078767] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1795.078767] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] self._fetch_image_if_missing(context, vi)
[ 1795.078767] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1795.078767] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] image_cache(vi, tmp_image_ds_loc)
[ 1795.078767] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] vm_util.copy_virtual_disk(
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] session._wait_for_task(vmdk_copy_task)
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] return self.wait_for_task(task_ref)
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] return evt.wait()
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] result = hub.switch()
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] return self.greenlet.switch()
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1795.079113] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] self.f(*self.args, **self.kw)
[ 1795.079599] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1795.079599] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] raise exceptions.translate_fault(task_info.error)
[ 1795.079599] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1795.079599] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Faults: ['InvalidArgument']
[ 1795.079599] env[59577]: ERROR nova.compute.manager [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a]
[ 1795.079599] env[59577]: DEBUG nova.compute.utils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] VimFaultException {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1795.080796] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Build of instance 145736d0-0f14-4ec3-ada2-ddb1fe6f271a was re-scheduled: A specified parameter was not correct: fileType
[ 1795.080796] env[59577]: Faults: ['InvalidArgument'] {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1795.081203] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1795.081376] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1795.081529] env[59577]: DEBUG nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1795.081691] env[59577]: DEBUG nova.network.neutron [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1795.382043] env[59577]: DEBUG nova.network.neutron [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1795.392223] env[59577]: INFO nova.compute.manager [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] [instance: 145736d0-0f14-4ec3-ada2-ddb1fe6f271a] Took 0.31 seconds to deallocate network for instance.
[ 1795.512109] env[59577]: INFO nova.scheduler.client.report [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Deleted allocations for instance 145736d0-0f14-4ec3-ada2-ddb1fe6f271a
[ 1795.529406] env[59577]: DEBUG oslo_concurrency.lockutils [None req-c2cf3bf9-1c80-4741-aa36-7b67e98e0232 tempest-MigrationsAdminTest-739812037 tempest-MigrationsAdminTest-739812037-project-member] Lock "145736d0-0f14-4ec3-ada2-ddb1fe6f271a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 182.619s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1795.545052] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Starting instance...
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1795.605194] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1795.605194] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1795.611144] env[59577]: INFO nova.compute.claims [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1795.727856] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1795.900843] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a22ce17c-e961-42e7-98e4-d64b6407eed9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1795.908894] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-695c2d86-7dbd-4e64-b90c-2eae6cd38020 {{(pid=59577) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1795.945565] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e3c6b54-0b9e-442f-a070-522b54944b21 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1795.953150] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6def26d0-b772-49e6-a96b-4587a011fb49 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1795.966154] env[59577]: DEBUG nova.compute.provider_tree [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1795.974692] env[59577]: DEBUG nova.scheduler.client.report [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1795.991191] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.386s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1795.991688] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1796.023774] env[59577]: DEBUG nova.compute.utils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1796.025523] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1796.025523] env[59577]: DEBUG nova.network.neutron [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1796.034617] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Start building block device mappings for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1796.044429] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1796.044591] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1796.044715] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1796.071260] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1796.071260] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1796.071371] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Skipping network cache update for instance because it is Building. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1796.071497] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 1a375c37-fcec-4442-827d-103352e81035] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1796.071627] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1796.072248] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1796.072397] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1796.072522] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1796.072641] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Skipping network cache update for instance because it is Building. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1796.072755] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1796.072874] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1796.107436] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1796.112836] env[59577]: DEBUG nova.policy [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '129ac5235de0454cbba850f5f21c2d26', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '41001df4ca2d4ba5b13f377c4ef88d5a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1796.133097] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 
tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1796.133344] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1796.133538] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1796.133681] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1796.133840] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 
tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1796.133971] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1796.134274] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1796.134477] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1796.134687] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1796.134888] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 1796.135114] env[59577]: DEBUG nova.virt.hardware [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1796.136020] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b568c6b8-6efa-4db4-b60f-c53b7c21df41 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1796.147018] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5e206c5-be57-43a7-8bb0-c6a3406de255 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1796.501327] env[59577]: DEBUG nova.network.neutron [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Successfully created port: 1765580f-24fe-4393-8a9a-68a9e29a5370 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1797.431053] env[59577]: DEBUG nova.compute.manager [req-3f5c1725-eded-4da0-9902-a603fdc621c5 req-b65c6363-ffc5-4696-8b74-8e3bbcb84f8e service nova] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Received event network-vif-plugged-1765580f-24fe-4393-8a9a-68a9e29a5370 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1797.431300] env[59577]: DEBUG oslo_concurrency.lockutils [req-3f5c1725-eded-4da0-9902-a603fdc621c5 req-b65c6363-ffc5-4696-8b74-8e3bbcb84f8e service nova] Acquiring lock "077b8c8d-ee7e-495b-a7f7-676fe7c70f83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" 
{{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1797.431508] env[59577]: DEBUG oslo_concurrency.lockutils [req-3f5c1725-eded-4da0-9902-a603fdc621c5 req-b65c6363-ffc5-4696-8b74-8e3bbcb84f8e service nova] Lock "077b8c8d-ee7e-495b-a7f7-676fe7c70f83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1797.431703] env[59577]: DEBUG oslo_concurrency.lockutils [req-3f5c1725-eded-4da0-9902-a603fdc621c5 req-b65c6363-ffc5-4696-8b74-8e3bbcb84f8e service nova] Lock "077b8c8d-ee7e-495b-a7f7-676fe7c70f83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1797.431817] env[59577]: DEBUG nova.compute.manager [req-3f5c1725-eded-4da0-9902-a603fdc621c5 req-b65c6363-ffc5-4696-8b74-8e3bbcb84f8e service nova] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] No waiting events found dispatching network-vif-plugged-1765580f-24fe-4393-8a9a-68a9e29a5370 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1797.431990] env[59577]: WARNING nova.compute.manager [req-3f5c1725-eded-4da0-9902-a603fdc621c5 req-b65c6363-ffc5-4696-8b74-8e3bbcb84f8e service nova] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Received unexpected event network-vif-plugged-1765580f-24fe-4393-8a9a-68a9e29a5370 for instance with vm_state building and task_state spawning. 
[ 1797.449968] env[59577]: DEBUG nova.network.neutron [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Successfully updated port: 1765580f-24fe-4393-8a9a-68a9e29a5370 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1797.462081] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Acquiring lock "refresh_cache-077b8c8d-ee7e-495b-a7f7-676fe7c70f83" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1797.462081] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Acquired lock "refresh_cache-077b8c8d-ee7e-495b-a7f7-676fe7c70f83" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1797.462081] env[59577]: DEBUG nova.network.neutron [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1797.535689] env[59577]: DEBUG nova.network.neutron [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Instance cache missing network info. 
{{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1797.841520] env[59577]: DEBUG nova.network.neutron [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Updating instance_info_cache with network_info: [{"id": "1765580f-24fe-4393-8a9a-68a9e29a5370", "address": "fa:16:3e:27:ae:b4", "network": {"id": "c99152ae-613b-4d0f-a26d-29a62a55a03b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-471148057-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41001df4ca2d4ba5b13f377c4ef88d5a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2", "external-id": "nsx-vlan-transportzone-268", "segmentation_id": 268, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1765580f-24", "ovs_interfaceid": "1765580f-24fe-4393-8a9a-68a9e29a5370", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1797.855204] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Releasing lock "refresh_cache-077b8c8d-ee7e-495b-a7f7-676fe7c70f83" {{(pid=59577) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1797.855502] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Instance network_info: |[{"id": "1765580f-24fe-4393-8a9a-68a9e29a5370", "address": "fa:16:3e:27:ae:b4", "network": {"id": "c99152ae-613b-4d0f-a26d-29a62a55a03b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-471148057-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41001df4ca2d4ba5b13f377c4ef88d5a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2", "external-id": "nsx-vlan-transportzone-268", "segmentation_id": 268, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1765580f-24", "ovs_interfaceid": "1765580f-24fe-4393-8a9a-68a9e29a5370", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1797.855872] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:27:ae:b4', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': '44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1765580f-24fe-4393-8a9a-68a9e29a5370', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1797.863120] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Creating folder: Project (41001df4ca2d4ba5b13f377c4ef88d5a). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1797.863611] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-631f3cdf-9541-4416-abe9-99906272b889 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1797.873752] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Created folder: Project (41001df4ca2d4ba5b13f377c4ef88d5a) in parent group-v398749. [ 1797.873929] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Creating folder: Instances. Parent ref: group-v398788. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1797.874157] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-de1eb732-0595-479e-9276-d0add6561661 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1797.881920] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Created folder: Instances in parent group-v398788. [ 1797.882149] env[59577]: DEBUG oslo.service.loopingcall [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1797.882316] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1797.882494] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-adfb05ca-e1e9-406b-bbab-cf5915ec9e69 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1797.900574] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1797.900574] env[59577]: value = "task-1933795" [ 1797.900574] env[59577]: _type = "Task" [ 1797.900574] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1797.907812] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933795, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1798.411027] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933795, 'name': CreateVM_Task, 'duration_secs': 0.417998} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1798.411027] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1798.411574] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1798.411741] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1798.412055] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1798.412404] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-850580e0-61c1-4899-a7ac-bbda1d0484e2 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1798.416678] env[59577]: DEBUG oslo_vmware.api [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Waiting for the task: (returnval){ [ 1798.416678] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]525226fe-0713-e460-dadf-b7b42cd44fb8" [ 1798.416678] env[59577]: _type = "Task" [ 1798.416678] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1798.424274] env[59577]: DEBUG oslo_vmware.api [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]525226fe-0713-e460-dadf-b7b42cd44fb8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1798.928457] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1798.928732] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1798.928982] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 
tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1799.523778] env[59577]: DEBUG nova.compute.manager [req-4b57d036-372a-48a0-96b6-2efc66c31aea req-6472def9-96f8-4546-af07-4176a428a8c1 service nova] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Received event network-changed-1765580f-24fe-4393-8a9a-68a9e29a5370 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1799.524242] env[59577]: DEBUG nova.compute.manager [req-4b57d036-372a-48a0-96b6-2efc66c31aea req-6472def9-96f8-4546-af07-4176a428a8c1 service nova] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Refreshing instance network info cache due to event network-changed-1765580f-24fe-4393-8a9a-68a9e29a5370. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1799.524316] env[59577]: DEBUG oslo_concurrency.lockutils [req-4b57d036-372a-48a0-96b6-2efc66c31aea req-6472def9-96f8-4546-af07-4176a428a8c1 service nova] Acquiring lock "refresh_cache-077b8c8d-ee7e-495b-a7f7-676fe7c70f83" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1799.524465] env[59577]: DEBUG oslo_concurrency.lockutils [req-4b57d036-372a-48a0-96b6-2efc66c31aea req-6472def9-96f8-4546-af07-4176a428a8c1 service nova] Acquired lock "refresh_cache-077b8c8d-ee7e-495b-a7f7-676fe7c70f83" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1799.524658] env[59577]: DEBUG nova.network.neutron [req-4b57d036-372a-48a0-96b6-2efc66c31aea req-6472def9-96f8-4546-af07-4176a428a8c1 service nova] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Refreshing network info cache for port 1765580f-24fe-4393-8a9a-68a9e29a5370 
{{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1799.996259] env[59577]: DEBUG nova.network.neutron [req-4b57d036-372a-48a0-96b6-2efc66c31aea req-6472def9-96f8-4546-af07-4176a428a8c1 service nova] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Updated VIF entry in instance network info cache for port 1765580f-24fe-4393-8a9a-68a9e29a5370. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1799.996644] env[59577]: DEBUG nova.network.neutron [req-4b57d036-372a-48a0-96b6-2efc66c31aea req-6472def9-96f8-4546-af07-4176a428a8c1 service nova] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Updating instance_info_cache with network_info: [{"id": "1765580f-24fe-4393-8a9a-68a9e29a5370", "address": "fa:16:3e:27:ae:b4", "network": {"id": "c99152ae-613b-4d0f-a26d-29a62a55a03b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-471148057-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41001df4ca2d4ba5b13f377c4ef88d5a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2", "external-id": "nsx-vlan-transportzone-268", "segmentation_id": 268, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1765580f-24", "ovs_interfaceid": "1765580f-24fe-4393-8a9a-68a9e29a5370", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 
1800.008660] env[59577]: DEBUG oslo_concurrency.lockutils [req-4b57d036-372a-48a0-96b6-2efc66c31aea req-6472def9-96f8-4546-af07-4176a428a8c1 service nova] Releasing lock "refresh_cache-077b8c8d-ee7e-495b-a7f7-676fe7c70f83" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1816.319864] env[59577]: DEBUG nova.compute.manager [req-e0a43a68-beb4-4152-8a26-b72f67a48311 req-f4e20221-1575-4cb6-a720-9197106e94f0 service nova] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Received event network-vif-deleted-760c39d4-271c-4e5c-bcb2-27aa69984700 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1818.807561] env[59577]: DEBUG nova.compute.manager [req-72a88cb3-88ec-4b94-b130-53ebeb839e48 req-dada1448-f0a1-4eb1-8671-1a4c01ca0a0f service nova] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Received event network-vif-deleted-99e66f6f-e73f-444b-852e-36b9125498c3 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1826.901166] env[59577]: DEBUG nova.compute.manager [req-2f8659ea-f98f-426e-b1f2-ad5191566603 req-cd1c01a3-cf43-45b7-9412-30d56a1c1e93 service nova] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Received event network-vif-deleted-c73f9fa7-46ad-4abb-b15a-e9ae9c7fe9d3 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1829.198416] env[59577]: DEBUG nova.compute.manager [req-a900ed4e-2284-4d46-8416-59b99b6955e8 req-07e94265-83c8-4e01-8676-3a5d858cbc24 service nova] [instance: 1a375c37-fcec-4442-827d-103352e81035] Received event network-vif-deleted-9975edaa-cbba-491b-b8a7-ad6fccdcdf24 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1831.233248] env[59577]: DEBUG nova.compute.manager [req-02d739ae-21f7-4ebf-9353-0edc88a155c5 req-3087ce35-77bb-4883-b03c-fe4a15efd6f8 service nova] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Received event 
network-vif-deleted-37e13eef-9f40-415c-b500-8af973e59e8e {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1832.476134] env[59577]: DEBUG nova.compute.manager [req-d6c6e339-849d-4aea-be9b-8919da91a5dd req-74c0444b-0984-4785-9026-2c8daef1d07c service nova] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Received event network-vif-deleted-fdc91f1d-4112-4f4c-9fd8-c4c5f35a61ae {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1833.692718] env[59577]: DEBUG nova.compute.manager [req-a8027ecb-63cb-4644-a1fa-819c2f86e8cf req-f99aa81a-5925-4a84-8bd1-aed5c21c8fac service nova] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Received event network-vif-deleted-2096eda8-b035-40d8-bb8c-b2f2fa309777 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1834.509305] env[59577]: DEBUG nova.compute.manager [req-885facac-ef1b-478a-8ee9-ad323964a8ae req-20ad886c-4c23-4e7a-b2df-71be86a9b41f service nova] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Received event network-vif-deleted-b8edabde-eff3-4547-b3c3-53d0e44af941 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1836.335266] env[59577]: DEBUG nova.compute.manager [req-b089bff9-0764-4bc3-9b9e-56df52a147c9 req-d09db52e-8c32-4663-b625-387b0e529d2a service nova] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Received event network-vif-deleted-441a0cf4-0c88-4060-9011-1de3673f29ba {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1838.499426] env[59577]: DEBUG nova.compute.manager [req-b3a9ff1b-113e-4cb5-b051-d0bfd8b4fdc3 req-a174e030-52fe-4380-b9ec-9e84c0177ecb service nova] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Received event network-vif-deleted-1765580f-24fe-4393-8a9a-68a9e29a5370 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1842.603474] env[59577]: WARNING oslo_vmware.rw_handles [None 
req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1842.603474] env[59577]: ERROR oslo_vmware.rw_handles [ 1842.603474] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/72d2580a-2330-46ab-ba4c-8935b06f1651/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1842.605087] env[59577]: 
DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1842.606393] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Copying Virtual Disk [datastore1] vmware_temp/72d2580a-2330-46ab-ba4c-8935b06f1651/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] vmware_temp/72d2580a-2330-46ab-ba4c-8935b06f1651/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1842.607126] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f96949d8-b5df-44fa-9756-6c8f6f618a85 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1842.615197] env[59577]: DEBUG oslo_vmware.api [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Waiting for the task: (returnval){ [ 1842.615197] env[59577]: value = "task-1933796" [ 1842.615197] env[59577]: _type = "Task" [ 1842.615197] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1842.624705] env[59577]: DEBUG oslo_vmware.api [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Task: {'id': task-1933796, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1843.126783] env[59577]: DEBUG oslo_vmware.exceptions [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Fault InvalidArgument not matched. {{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1843.127301] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1843.127940] env[59577]: ERROR nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1843.127940] env[59577]: Faults: ['InvalidArgument'] [ 1843.127940] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Traceback (most recent call last): [ 1843.127940] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1843.127940] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] yield resources [ 1843.127940] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1843.127940] env[59577]: ERROR nova.compute.manager [instance: 
cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] self.driver.spawn(context, instance, image_meta, [ 1843.127940] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1843.127940] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1843.127940] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1843.127940] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] self._fetch_image_if_missing(context, vi) [ 1843.127940] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] image_cache(vi, tmp_image_ds_loc) [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] vm_util.copy_virtual_disk( [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] session._wait_for_task(vmdk_copy_task) [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: 
cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] return self.wait_for_task(task_ref) [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] return evt.wait() [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] result = hub.switch() [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1843.128256] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] return self.greenlet.switch() [ 1843.128545] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1843.128545] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] self.f(*self.args, **self.kw) [ 1843.128545] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1843.128545] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] raise exceptions.translate_fault(task_info.error) [ 1843.128545] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1843.128545] env[59577]: ERROR nova.compute.manager [instance: 
cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Faults: ['InvalidArgument'] [ 1843.128545] env[59577]: ERROR nova.compute.manager [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] [ 1843.128706] env[59577]: INFO nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Terminating instance [ 1843.130364] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1843.130633] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1843.131301] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Start destroying the instance on the hypervisor. 
{{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1843.131544] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1843.131898] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ac49e99a-995b-490d-bdde-8f81d39ce983 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.134640] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11ba7729-2c5b-4099-b202-6e61be397f64 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.142812] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1843.143888] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8edff038-05bb-4e65-93c2-e4fce73c7170 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.145499] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1843.145717] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None 
req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1843.146414] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d94d8abd-fa04-4ec0-8d16-f84a748b4353 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.152115] env[59577]: DEBUG oslo_vmware.api [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Waiting for the task: (returnval){ [ 1843.152115] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]527243fd-17a7-293f-b030-6bf4f408725b" [ 1843.152115] env[59577]: _type = "Task" [ 1843.152115] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1843.160892] env[59577]: DEBUG oslo_vmware.api [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]527243fd-17a7-293f-b030-6bf4f408725b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1843.358624] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1843.359351] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1843.359602] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Deleting the datastore file [datastore1] cc3276aa-0d5a-4a14-ae90-e20a1b823bd3 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1843.359924] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4bb95906-17d9-42a0-84c2-df3737bcbbd2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.366793] env[59577]: DEBUG oslo_vmware.api [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Waiting for the task: (returnval){ [ 1843.366793] env[59577]: value = "task-1933798" [ 1843.366793] env[59577]: _type = "Task" [ 1843.366793] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1843.375919] env[59577]: DEBUG oslo_vmware.api [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Task: {'id': task-1933798, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1843.661924] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1843.662432] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Creating directory with path [datastore1] vmware_temp/bdcb5825-aea8-4325-b4bb-de80d58af8b6/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1843.662685] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ba5e2892-9e70-4aa8-8b52-01cebb22d05b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.675339] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Created directory with path [datastore1] vmware_temp/bdcb5825-aea8-4325-b4bb-de80d58af8b6/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1843.675549] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None 
req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Fetch image to [datastore1] vmware_temp/bdcb5825-aea8-4325-b4bb-de80d58af8b6/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1843.675733] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/bdcb5825-aea8-4325-b4bb-de80d58af8b6/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1843.676621] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35be7f60-e14c-4994-91cc-639c2acfc15b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.685868] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-333ce730-4ea0-4e17-a5b8-a451eb0a941c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.696548] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12b1d4ca-63e8-4817-b77c-46e07125479b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.732860] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13028add-d109-485c-954b-34fee89d4795 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.740260] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-87b78900-baf9-4ff2-b3dc-82c4168343c1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1843.761123] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1843.876871] env[59577]: DEBUG oslo_vmware.api [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Task: {'id': task-1933798, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069594} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1843.877988] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1843.878259] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1843.878476] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1843.878816] env[59577]: INFO nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Took 0.75 seconds to destroy the instance on the hypervisor. 
[ 1843.880852] env[59577]: DEBUG nova.compute.claims [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1843.881328] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1843.881593] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1843.927079] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.045s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1843.927832] env[59577]: DEBUG nova.compute.utils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Instance cc3276aa-0d5a-4a14-ae90-e20a1b823bd3 could not be found. 
{{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1843.929693] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1843.929908] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1843.930093] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1843.930291] env[59577]: DEBUG nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1843.930472] env[59577]: DEBUG nova.network.neutron [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1843.933172] env[59577]: DEBUG oslo_vmware.rw_handles [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bdcb5825-aea8-4325-b4bb-de80d58af8b6/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1843.995640] env[59577]: DEBUG oslo_vmware.rw_handles [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Completed reading data from the image iterator. 
{{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1843.995824] env[59577]: DEBUG oslo_vmware.rw_handles [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bdcb5825-aea8-4325-b4bb-de80d58af8b6/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1844.008143] env[59577]: DEBUG nova.network.neutron [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1844.018969] env[59577]: INFO nova.compute.manager [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Took 0.09 seconds to deallocate network for instance. 
[ 1844.068477] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5df4bbfd-0015-4947-bd39-b4abdac0a7a1 tempest-DeleteServersTestJSON-1234060116 tempest-DeleteServersTestJSON-1234060116-project-member] Lock "cc3276aa-0d5a-4a14-ae90-e20a1b823bd3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.824s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1844.080054] env[59577]: DEBUG nova.compute.manager [None req-9049e4db-20ea-4ff1-a288-b177a2f749ce tempest-ServerActionsTestJSON-1026393023 tempest-ServerActionsTestJSON-1026393023-project-member] [instance: 640c1048-dca1-4fbd-889b-cd8aa23eb3f2] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1844.108202] env[59577]: DEBUG nova.compute.manager [None req-9049e4db-20ea-4ff1-a288-b177a2f749ce tempest-ServerActionsTestJSON-1026393023 tempest-ServerActionsTestJSON-1026393023-project-member] [instance: 640c1048-dca1-4fbd-889b-cd8aa23eb3f2] Instance disappeared before build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1844.135053] env[59577]: DEBUG oslo_concurrency.lockutils [None req-9049e4db-20ea-4ff1-a288-b177a2f749ce tempest-ServerActionsTestJSON-1026393023 tempest-ServerActionsTestJSON-1026393023-project-member] Lock "640c1048-dca1-4fbd-889b-cd8aa23eb3f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.877s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1844.146431] env[59577]: DEBUG nova.compute.manager [None req-05ebad77-e053-4aa6-9355-404a57c4f831 tempest-AttachVolumeTestJSON-342258731 tempest-AttachVolumeTestJSON-342258731-project-member] [instance: d25dbbad-94bb-4147-aa56-91aaab4ed077] Starting instance... 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1844.175388] env[59577]: DEBUG nova.compute.manager [None req-05ebad77-e053-4aa6-9355-404a57c4f831 tempest-AttachVolumeTestJSON-342258731 tempest-AttachVolumeTestJSON-342258731-project-member] [instance: d25dbbad-94bb-4147-aa56-91aaab4ed077] Instance disappeared before build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1844.204386] env[59577]: DEBUG oslo_concurrency.lockutils [None req-05ebad77-e053-4aa6-9355-404a57c4f831 tempest-AttachVolumeTestJSON-342258731 tempest-AttachVolumeTestJSON-342258731-project-member] Lock "d25dbbad-94bb-4147-aa56-91aaab4ed077" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.893s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1844.215700] env[59577]: DEBUG nova.compute.manager [None req-2bc953ea-fe68-497c-b3dd-b82e0ba19246 tempest-ServerDiagnosticsV248Test-1906389932 tempest-ServerDiagnosticsV248Test-1906389932-project-member] [instance: 7b8c78af-c9f9-4bff-a667-6faa0fb7c482] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1844.249270] env[59577]: DEBUG nova.compute.manager [None req-2bc953ea-fe68-497c-b3dd-b82e0ba19246 tempest-ServerDiagnosticsV248Test-1906389932 tempest-ServerDiagnosticsV248Test-1906389932-project-member] [instance: 7b8c78af-c9f9-4bff-a667-6faa0fb7c482] Instance disappeared before build. 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1844.288353] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2bc953ea-fe68-497c-b3dd-b82e0ba19246 tempest-ServerDiagnosticsV248Test-1906389932 tempest-ServerDiagnosticsV248Test-1906389932-project-member] Lock "7b8c78af-c9f9-4bff-a667-6faa0fb7c482" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.199s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1844.300581] env[59577]: DEBUG nova.compute.manager [None req-2d3a4d0b-cc26-464b-8c28-354c4dcce1d7 tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 16a36155-12ef-40b2-b94d-db4619ac2f4b] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1844.325364] env[59577]: DEBUG nova.compute.manager [None req-2d3a4d0b-cc26-464b-8c28-354c4dcce1d7 tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 16a36155-12ef-40b2-b94d-db4619ac2f4b] Instance disappeared before build. 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1844.349424] env[59577]: DEBUG oslo_concurrency.lockutils [None req-2d3a4d0b-cc26-464b-8c28-354c4dcce1d7 tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Lock "16a36155-12ef-40b2-b94d-db4619ac2f4b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.935s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1844.360765] env[59577]: DEBUG nova.compute.manager [None req-0f0396f5-8ca8-48e3-8f50-71e3ccba4a0f tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: cefa22cd-9d7a-4b86-9ec5-bb9f005b42e0] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1844.390439] env[59577]: DEBUG nova.compute.manager [None req-0f0396f5-8ca8-48e3-8f50-71e3ccba4a0f tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: cefa22cd-9d7a-4b86-9ec5-bb9f005b42e0] Instance disappeared before build. 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1844.412315] env[59577]: DEBUG oslo_concurrency.lockutils [None req-0f0396f5-8ca8-48e3-8f50-71e3ccba4a0f tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Lock "cefa22cd-9d7a-4b86-9ec5-bb9f005b42e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.175s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1844.424345] env[59577]: DEBUG nova.compute.manager [None req-66f099a0-0d3b-4270-9770-d32e70f14d28 tempest-ServerActionsTestOtherA-455475876 tempest-ServerActionsTestOtherA-455475876-project-member] [instance: eaa781b6-6542-495d-8430-73416444d972] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1844.454885] env[59577]: DEBUG nova.compute.manager [None req-66f099a0-0d3b-4270-9770-d32e70f14d28 tempest-ServerActionsTestOtherA-455475876 tempest-ServerActionsTestOtherA-455475876-project-member] [instance: eaa781b6-6542-495d-8430-73416444d972] Instance disappeared before build. 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1844.507080] env[59577]: DEBUG oslo_concurrency.lockutils [None req-66f099a0-0d3b-4270-9770-d32e70f14d28 tempest-ServerActionsTestOtherA-455475876 tempest-ServerActionsTestOtherA-455475876-project-member] Lock "eaa781b6-6542-495d-8430-73416444d972" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.073s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1844.518304] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1844.586111] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1844.586370] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1844.588014] env[59577]: INFO nova.compute.claims [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] 
[instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1844.721959] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2397d046-0c25-449c-b746-627d3ce78d2e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1844.730342] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81d50878-8145-4f55-8ece-77ec0c03223b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1844.762962] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-524ad957-a890-4833-9306-3fbc5d178a47 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1844.771232] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79e62244-b0be-4fdc-8ebf-38c95d734c2f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1844.787879] env[59577]: DEBUG nova.compute.provider_tree [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1844.797812] env[59577]: DEBUG nova.scheduler.client.report [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1844.813190] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1844.813724] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1844.854564] env[59577]: DEBUG nova.compute.utils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1844.855970] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Allocating IP information in the background. 
{{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1844.856693] env[59577]: DEBUG nova.network.neutron [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1844.874665] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1844.947674] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1844.972772] env[59577]: DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1844.973034] env[59577]: DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1844.973200] env[59577]: DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1844.973388] env[59577]: DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 
tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1844.973537] env[59577]: DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1844.973703] env[59577]: DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1844.973953] env[59577]: DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1844.974123] env[59577]: DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1844.974296] env[59577]: DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1844.974460] env[59577]: 
DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1844.974627] env[59577]: DEBUG nova.virt.hardware [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1844.975494] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af72cc16-9c1e-415f-b439-cd850b9eb122 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1844.984174] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1d35296-542b-45a3-8c3e-b549ac69481b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1845.036998] env[59577]: DEBUG nova.policy [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a440ca0a5e514d029b371ca8d4ced034', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '307f941cd86b44e18ebb7482eb6a9257', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 
1845.614939] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Acquiring lock "5af13348-9f89-44b2-93bd-f9fb91598c73" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1845.615204] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Lock "5af13348-9f89-44b2-93bd-f9fb91598c73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1846.539266] env[59577]: DEBUG nova.network.neutron [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Successfully created port: 3b1e4882-79a7-48cb-b3b0-36e3958b07eb {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1846.914992] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "56259797-6883-437c-8942-5beca0e1ef7b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1846.915492] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Lock 
"56259797-6883-437c-8942-5beca0e1ef7b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1847.047312] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1847.047900] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1847.355579] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Acquiring lock "22339279-c381-4ccb-bba0-b0b554203e60" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1847.357304] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Lock "22339279-c381-4ccb-bba0-b0b554203e60" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1847.624684] env[59577]: DEBUG oslo_concurrency.lockutils [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] 
Acquiring lock "8ed18ae2-2ba1-424c-b695-846afd7b3501" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1848.324817] env[59577]: DEBUG nova.compute.manager [req-0149885e-12ce-41ca-8a0d-51d6ca6c65c0 req-c3ac59b1-6226-49d4-8cdc-e22de34f679c service nova] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Received event network-vif-plugged-3b1e4882-79a7-48cb-b3b0-36e3958b07eb {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1848.325523] env[59577]: DEBUG oslo_concurrency.lockutils [req-0149885e-12ce-41ca-8a0d-51d6ca6c65c0 req-c3ac59b1-6226-49d4-8cdc-e22de34f679c service nova] Acquiring lock "8ed18ae2-2ba1-424c-b695-846afd7b3501-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1848.325842] env[59577]: DEBUG oslo_concurrency.lockutils [req-0149885e-12ce-41ca-8a0d-51d6ca6c65c0 req-c3ac59b1-6226-49d4-8cdc-e22de34f679c service nova] Lock "8ed18ae2-2ba1-424c-b695-846afd7b3501-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1848.329532] env[59577]: DEBUG oslo_concurrency.lockutils [req-0149885e-12ce-41ca-8a0d-51d6ca6c65c0 req-c3ac59b1-6226-49d4-8cdc-e22de34f679c service nova] Lock "8ed18ae2-2ba1-424c-b695-846afd7b3501-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1848.329532] env[59577]: DEBUG nova.compute.manager [req-0149885e-12ce-41ca-8a0d-51d6ca6c65c0 req-c3ac59b1-6226-49d4-8cdc-e22de34f679c service nova] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] No waiting events 
found dispatching network-vif-plugged-3b1e4882-79a7-48cb-b3b0-36e3958b07eb {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1848.329532] env[59577]: WARNING nova.compute.manager [req-0149885e-12ce-41ca-8a0d-51d6ca6c65c0 req-c3ac59b1-6226-49d4-8cdc-e22de34f679c service nova] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Received unexpected event network-vif-plugged-3b1e4882-79a7-48cb-b3b0-36e3958b07eb for instance with vm_state building and task_state deleting. [ 1848.396700] env[59577]: DEBUG nova.network.neutron [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Successfully updated port: 3b1e4882-79a7-48cb-b3b0-36e3958b07eb {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1848.407732] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquiring lock "refresh_cache-8ed18ae2-2ba1-424c-b695-846afd7b3501" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1848.407892] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquired lock "refresh_cache-8ed18ae2-2ba1-424c-b695-846afd7b3501" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1848.411131] env[59577]: DEBUG nova.network.neutron [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Building network info cache for instance {{(pid=59577) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1848.465199] env[59577]: DEBUG nova.network.neutron [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1848.969904] env[59577]: DEBUG nova.network.neutron [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Updating instance_info_cache with network_info: [{"id": "3b1e4882-79a7-48cb-b3b0-36e3958b07eb", "address": "fa:16:3e:37:3e:f5", "network": {"id": "f2b32a30-bd23-49e6-a9d4-d18f4dbd45f5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-815472418-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "307f941cd86b44e18ebb7482eb6a9257", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3e05affa-2640-435e-a124-0ee8a6ab1152", "external-id": "nsx-vlan-transportzone-839", "segmentation_id": 839, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b1e4882-79", "ovs_interfaceid": "3b1e4882-79a7-48cb-b3b0-36e3958b07eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1848.988302] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Releasing lock "refresh_cache-8ed18ae2-2ba1-424c-b695-846afd7b3501" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1848.988629] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Instance network_info: |[{"id": "3b1e4882-79a7-48cb-b3b0-36e3958b07eb", "address": "fa:16:3e:37:3e:f5", "network": {"id": "f2b32a30-bd23-49e6-a9d4-d18f4dbd45f5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-815472418-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "307f941cd86b44e18ebb7482eb6a9257", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3e05affa-2640-435e-a124-0ee8a6ab1152", "external-id": "nsx-vlan-transportzone-839", "segmentation_id": 839, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b1e4882-79", "ovs_interfaceid": "3b1e4882-79a7-48cb-b3b0-36e3958b07eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1848.989178] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:37:3e:f5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3e05affa-2640-435e-a124-0ee8a6ab1152', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3b1e4882-79a7-48cb-b3b0-36e3958b07eb', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1848.996683] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Creating folder: Project (307f941cd86b44e18ebb7482eb6a9257). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1849.000018] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d3305014-f150-4ce0-84e3-a2e2c296b2a6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1849.008895] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Created folder: Project (307f941cd86b44e18ebb7482eb6a9257) in parent group-v398749. [ 1849.009498] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Creating folder: Instances. Parent ref: group-v398791. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1849.009498] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4ac540fe-ded4-413c-8007-b29fe1d77de5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1849.021227] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Created folder: Instances in parent group-v398791. [ 1849.021227] env[59577]: DEBUG oslo.service.loopingcall [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1849.021227] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1849.021227] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b5d1b9ca-4890-4625-ba18-ab2198bb5dae {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1849.039046] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1849.039046] env[59577]: value = "task-1933801" [ 1849.039046] env[59577]: _type = "Task" [ 1849.039046] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1849.048245] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1849.048245] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933801, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1849.549795] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933801, 'name': CreateVM_Task, 'duration_secs': 0.353317} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1849.549999] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1849.550675] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1849.550837] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1849.551203] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d 
tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1849.551405] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0880b601-cc9c-4623-a218-4763fcf6eed5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1849.558390] env[59577]: DEBUG oslo_vmware.api [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Waiting for the task: (returnval){ [ 1849.558390] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5277c013-6544-af91-8009-98fd798803a2" [ 1849.558390] env[59577]: _type = "Task" [ 1849.558390] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1849.565678] env[59577]: DEBUG oslo_vmware.api [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5277c013-6544-af91-8009-98fd798803a2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1850.067989] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1850.068263] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1850.068474] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1850.248790] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Acquiring lock "9c96e4d7-30d3-44fc-b8d0-14271dc19ce3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1850.249076] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576
tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Lock "9c96e4d7-30d3-44fc-b8d0-14271dc19ce3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1850.358858] env[59577]: DEBUG nova.compute.manager [req-64519f81-164c-417d-8704-d5861f31b8c3 req-d34e8a33-a248-4fea-b335-eb78de22cf8c service nova] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Received event network-changed-3b1e4882-79a7-48cb-b3b0-36e3958b07eb {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1850.358971] env[59577]: DEBUG nova.compute.manager [req-64519f81-164c-417d-8704-d5861f31b8c3 req-d34e8a33-a248-4fea-b335-eb78de22cf8c service nova] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Refreshing instance network info cache due to event network-changed-3b1e4882-79a7-48cb-b3b0-36e3958b07eb. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1850.359203] env[59577]: DEBUG oslo_concurrency.lockutils [req-64519f81-164c-417d-8704-d5861f31b8c3 req-d34e8a33-a248-4fea-b335-eb78de22cf8c service nova] Acquiring lock "refresh_cache-8ed18ae2-2ba1-424c-b695-846afd7b3501" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1850.359349] env[59577]: DEBUG oslo_concurrency.lockutils [req-64519f81-164c-417d-8704-d5861f31b8c3 req-d34e8a33-a248-4fea-b335-eb78de22cf8c service nova] Acquired lock "refresh_cache-8ed18ae2-2ba1-424c-b695-846afd7b3501" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1850.359506] env[59577]: DEBUG nova.network.neutron [req-64519f81-164c-417d-8704-d5861f31b8c3 req-d34e8a33-a248-4fea-b335-eb78de22cf8c service nova] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Refreshing network info cache for port 3b1e4882-79a7-48cb-b3b0-36e3958b07eb
{{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1850.620603] env[59577]: DEBUG nova.network.neutron [req-64519f81-164c-417d-8704-d5861f31b8c3 req-d34e8a33-a248-4fea-b335-eb78de22cf8c service nova] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Updated VIF entry in instance network info cache for port 3b1e4882-79a7-48cb-b3b0-36e3958b07eb. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1850.620603] env[59577]: DEBUG nova.network.neutron [req-64519f81-164c-417d-8704-d5861f31b8c3 req-d34e8a33-a248-4fea-b335-eb78de22cf8c service nova] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Updating instance_info_cache with network_info: [{"id": "3b1e4882-79a7-48cb-b3b0-36e3958b07eb", "address": "fa:16:3e:37:3e:f5", "network": {"id": "f2b32a30-bd23-49e6-a9d4-d18f4dbd45f5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-815472418-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "307f941cd86b44e18ebb7482eb6a9257", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3e05affa-2640-435e-a124-0ee8a6ab1152", "external-id": "nsx-vlan-transportzone-839", "segmentation_id": 839, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b1e4882-79", "ovs_interfaceid": "3b1e4882-79a7-48cb-b3b0-36e3958b07eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 
1850.631147] env[59577]: DEBUG oslo_concurrency.lockutils [req-64519f81-164c-417d-8704-d5861f31b8c3 req-d34e8a33-a248-4fea-b335-eb78de22cf8c service nova] Releasing lock "refresh_cache-8ed18ae2-2ba1-424c-b695-846afd7b3501" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1851.846782] env[59577]: DEBUG oslo_concurrency.lockutils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Acquiring lock "e2c0fb3c-6cee-4be4-a368-0fa86a07ff88" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1851.847054] env[59577]: DEBUG oslo_concurrency.lockutils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Lock "e2c0fb3c-6cee-4be4-a368-0fa86a07ff88" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1853.044979] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1853.045268] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1854.040342] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1854.045063] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1854.045346] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Cleaning up deleted instances with incomplete migration {{(pid=59577) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 1855.051748] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1856.045060] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1856.055756] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1856.056040] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1856.056174] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1856.056324] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1856.057421] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dc64836-3c9b-4d97-80bd-24af8ade1c4f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1856.066043] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4383a65a-3fbb-4607-98cc-56c559dbb043 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1856.079459] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36813834-9262-4e44-b119-59d58a0586d5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1856.085283] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb31fa46-3601-4903-ad0e-5fbec2dc0ddb {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1856.113166] env[59577]: DEBUG nova.compute.resource_tracker [None 
req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181326MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1856.113299] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1856.113477] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1856.149383] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 8ed18ae2-2ba1-424c-b695-846afd7b3501 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1856.158847] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 86651832-02db-4181-8a9e-11da5f017f65 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1856.168645] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance aac8eec6-577b-46d2-9baa-8cf548a6970e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1856.178273] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 5af13348-9f89-44b2-93bd-f9fb91598c73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1856.188088] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 56259797-6883-437c-8942-5beca0e1ef7b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1856.196507] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 22339279-c381-4ccb-bba0-b0b554203e60 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1856.205071] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1856.213613] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance e2c0fb3c-6cee-4be4-a368-0fa86a07ff88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1856.213800] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1856.213946] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1856.310021] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a12ce015-ab10-4c80-82e6-e0af8414cfc5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1856.315904] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c93a5f10-329e-4613-9e48-eeb2c90cc24f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1856.346019] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f2815ea-d46e-4847-a433-6119a025fda0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1856.352877] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b12c7371-c712-4854-9737-fc1bff59aae2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1856.365607] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: 
cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1856.373621] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1856.386059] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1856.386238] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.273s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1857.387192] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1858.044711] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1858.044945] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1858.045104] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1858.055151] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1858.055246] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1858.594096] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Acquiring lock "3d63cb9b-3c20-4c34-a96e-29e3dcea65a7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1858.594362] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Lock "3d63cb9b-3c20-4c34-a96e-29e3dcea65a7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1861.045448] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1861.045716] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Cleaning up deleted instances {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 1861.073072] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] There are 10 instances to clean {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 1861.073330] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Instance has had 0 of 5 cleanup attempts {{(pid=59577) _run_pending_deletes 
/opt/stack/nova/nova/compute/manager.py:11114}} [ 1861.109334] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Instance has had 0 of 5 cleanup attempts {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1861.145191] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Instance has had 0 of 5 cleanup attempts {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1861.163981] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Instance has had 0 of 5 cleanup attempts {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1861.182933] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Instance has had 0 of 5 cleanup attempts {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1861.202005] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Instance has had 0 of 5 cleanup attempts {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1861.221224] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 1a375c37-fcec-4442-827d-103352e81035] Instance has had 0 of 5 cleanup attempts {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1861.239341] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Instance has had 0 of 5 cleanup attempts {{(pid=59577) _run_pending_deletes 
/opt/stack/nova/nova/compute/manager.py:11114}} [ 1861.258225] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Instance has had 0 of 5 cleanup attempts {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1861.276438] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: cc3276aa-0d5a-4a14-ae90-e20a1b823bd3] Instance has had 0 of 5 cleanup attempts {{(pid=59577) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1866.293832] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1867.044593] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1881.285534] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1881.297612] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Getting list of instances from cluster (obj){ [ 1881.297612] env[59577]: value = "domain-c8" [ 1881.297612] env[59577]: _type = "ClusterComputeResource" [ 1881.297612] env[59577]: } {{(pid=59577) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1881.298661] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f22fdc76-9ea0-4439-9761-860ea21a1b75 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1881.315949] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Got total of 10 instances {{(pid=59577) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1881.316120] env[59577]: WARNING nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] While synchronizing instance power states, found 1 instances in the database and 10 instances on the hypervisor. [ 1881.316262] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Triggering sync for uuid 8ed18ae2-2ba1-424c-b695-846afd7b3501 {{(pid=59577) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 1881.316582] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "8ed18ae2-2ba1-424c-b695-846afd7b3501" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1891.111122] env[59577]: WARNING oslo_vmware.rw_handles [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in 
getresponse [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1891.111122] env[59577]: ERROR oslo_vmware.rw_handles [ 1891.111790] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/bdcb5825-aea8-4325-b4bb-de80d58af8b6/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1891.113337] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1891.113584] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Copying Virtual Disk [datastore1] vmware_temp/bdcb5825-aea8-4325-b4bb-de80d58af8b6/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] 
vmware_temp/bdcb5825-aea8-4325-b4bb-de80d58af8b6/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1891.113860] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e1bc4cdb-ccd4-47e2-8dfa-eee8c49dc0c5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1891.121237] env[59577]: DEBUG oslo_vmware.api [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Waiting for the task: (returnval){ [ 1891.121237] env[59577]: value = "task-1933802" [ 1891.121237] env[59577]: _type = "Task" [ 1891.121237] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1891.129243] env[59577]: DEBUG oslo_vmware.api [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Task: {'id': task-1933802, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1891.632506] env[59577]: DEBUG oslo_vmware.exceptions [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Fault InvalidArgument not matched. 
{{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1891.632727] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1891.633306] env[59577]: ERROR nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1891.633306] env[59577]: Faults: ['InvalidArgument'] [ 1891.633306] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Traceback (most recent call last): [ 1891.633306] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1891.633306] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] yield resources [ 1891.633306] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1891.633306] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] self.driver.spawn(context, instance, image_meta, [ 1891.633306] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1891.633306] env[59577]: ERROR nova.compute.manager [instance: 
ee50624e-74d6-4afc-9fba-c541f1b83554] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1891.633306] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1891.633306] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] self._fetch_image_if_missing(context, vi) [ 1891.633306] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] image_cache(vi, tmp_image_ds_loc) [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] vm_util.copy_virtual_disk( [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] session._wait_for_task(vmdk_copy_task) [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] return self.wait_for_task(task_ref) [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: 
ee50624e-74d6-4afc-9fba-c541f1b83554] return evt.wait() [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] result = hub.switch() [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1891.633652] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] return self.greenlet.switch() [ 1891.634050] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1891.634050] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] self.f(*self.args, **self.kw) [ 1891.634050] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1891.634050] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] raise exceptions.translate_fault(task_info.error) [ 1891.634050] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1891.634050] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Faults: ['InvalidArgument'] [ 1891.634050] env[59577]: ERROR nova.compute.manager [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] [ 1891.634050] env[59577]: INFO nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 
tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Terminating instance [ 1891.635192] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1891.635397] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1891.635641] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a0137bc9-9395-42cf-a5c2-a374d6159b24 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1891.637688] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Start destroying the instance on the hypervisor. 
{{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1891.637877] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1891.638577] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9054da6-c516-4ec8-a918-cfbc2d298456 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1891.645426] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1891.646311] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c9863a15-2552-469b-8de2-29a89a4a30cd {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1891.647618] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1891.647780] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1891.648423] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ad7a523b-058a-4ed0-b4ce-2a42d3054d4b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1891.653244] env[59577]: DEBUG oslo_vmware.api [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Waiting for the task: (returnval){ [ 1891.653244] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52466610-f383-ffec-7596-711efa8bfe55" [ 1891.653244] env[59577]: _type = "Task" [ 1891.653244] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1891.660369] env[59577]: DEBUG oslo_vmware.api [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52466610-f383-ffec-7596-711efa8bfe55, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1891.729791] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1891.730012] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1891.730198] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Deleting the datastore file [datastore1] ee50624e-74d6-4afc-9fba-c541f1b83554 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1891.730453] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fe362226-e31a-4455-a756-9cef255d92e5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1891.737520] env[59577]: DEBUG oslo_vmware.api [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Waiting for the task: (returnval){ [ 1891.737520] env[59577]: value = "task-1933804" [ 1891.737520] env[59577]: _type = "Task" [ 1891.737520] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1891.744635] env[59577]: DEBUG oslo_vmware.api [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Task: {'id': task-1933804, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1892.164773] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1892.165136] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Creating directory with path [datastore1] vmware_temp/92720e6f-5860-4585-b192-e3edb293ef1f/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1892.165360] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e5c2dd0c-5cdb-4e94-91ef-bd5446a148a7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.177165] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Created directory with path [datastore1] vmware_temp/92720e6f-5860-4585-b192-e3edb293ef1f/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1892.177402] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 
tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Fetch image to [datastore1] vmware_temp/92720e6f-5860-4585-b192-e3edb293ef1f/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1892.177640] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/92720e6f-5860-4585-b192-e3edb293ef1f/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1892.178390] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c07824d9-953e-43d9-8ff0-985fb9ef07aa {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.185363] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b72c106f-f7a7-4655-b336-e419baf193a8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.194334] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fecb462-7ec0-481d-8c9f-722dd4e019e3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.225562] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c18a83f-eb8c-4c85-bccb-fe5b825e91d1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.231893] env[59577]: 
DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3b64542e-6f4a-4f99-a686-14846aed91b6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.245359] env[59577]: DEBUG oslo_vmware.api [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Task: {'id': task-1933804, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069546} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1892.245779] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1892.245825] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1892.246010] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1892.246212] env[59577]: INFO nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Took 0.61 seconds to destroy the instance on the 
hypervisor. [ 1892.248376] env[59577]: DEBUG nova.compute.claims [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1892.248545] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1892.251182] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1892.254035] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1892.276027] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=59577) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1892.276342] env[59577]: DEBUG nova.compute.utils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Instance ee50624e-74d6-4afc-9fba-c541f1b83554 could not be found. {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1892.278347] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1892.278537] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1892.278777] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1892.278968] env[59577]: DEBUG nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1892.279148] env[59577]: DEBUG nova.network.neutron [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1892.305806] env[59577]: DEBUG nova.network.neutron [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1892.315068] env[59577]: INFO nova.compute.manager [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] [instance: ee50624e-74d6-4afc-9fba-c541f1b83554] Took 0.04 seconds to deallocate network for instance. 
[ 1892.354554] env[59577]: DEBUG oslo_concurrency.lockutils [None req-962587f8-d0e7-43ae-8d4d-a3a30681adc3 tempest-ServersTestFqdnHostnames-148247989 tempest-ServersTestFqdnHostnames-148247989-project-member] Lock "ee50624e-74d6-4afc-9fba-c541f1b83554" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 272.979s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1892.363345] env[59577]: DEBUG nova.compute.manager [None req-e7736f56-1efe-4470-808a-6f14c5fdbc33 tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 659b62e3-0bb2-46fb-aa6a-faef4883bdc1] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1892.386018] env[59577]: DEBUG nova.compute.manager [None req-e7736f56-1efe-4470-808a-6f14c5fdbc33 tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] [instance: 659b62e3-0bb2-46fb-aa6a-faef4883bdc1] Instance disappeared before build. 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1892.389064] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1892.390759] env[59577]: ERROR nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. [ 1892.390759] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] Traceback (most recent call last): [ 1892.390759] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1892.390759] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1892.390759] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1892.390759] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] result = getattr(controller, method)(*args, **kwargs) [ 1892.390759] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1892.390759] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return 
self._get(image_id) [ 1892.390759] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1892.390759] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1892.390759] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] resp, body = self.http_client.get(url, headers=header) [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self.request(url, 'GET', **kwargs) [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self._handle_response(resp) [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise exc.from_response(resp, resp.content) [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you 
requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] During handling of the above exception, another exception occurred: [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] [ 1892.391190] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] Traceback (most recent call last): [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] yield resources [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self.driver.spawn(context, instance, image_meta, [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self._fetch_image_if_missing(context, vi) [ 1892.391640] env[59577]: 
ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] image_fetch(context, vi, tmp_image_ds_loc) [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] images.fetch_image( [ 1892.391640] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] metadata = IMAGE_API.get(context, image_ref) [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return session.show(context, image_id, [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] _reraise_translated_image_exception(image_id) [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise new_exc.with_traceback(exc_trace) [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 
1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] result = getattr(controller, method)(*args, **kwargs) [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1892.391926] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self._get(image_id) [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] resp, body = self.http_client.get(url, headers=header) [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self.request(url, 'GET', **kwargs) [ 1892.392217] env[59577]: ERROR 
nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self._handle_response(resp) [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise exc.from_response(resp, resp.content) [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. [ 1892.392217] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] [ 1892.392651] env[59577]: INFO nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Terminating instance [ 1892.392697] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1892.392898] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1892.393546] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1892.393748] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1892.394029] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d2b8fb34-b385-4d65-b76f-e731166d6677 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.397087] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd5d3f26-d762-414e-b6e8-be76e88f700a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.406582] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1892.406808] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3347e8c0-bcd7-41de-bf53-0b33bd36fbb0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.409113] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None 
req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1892.409277] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1892.410249] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-854a7dba-7f82-452c-9154-a60abe506d7c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.413057] env[59577]: DEBUG oslo_concurrency.lockutils [None req-e7736f56-1efe-4470-808a-6f14c5fdbc33 tempest-DeleteServersAdminTestJSON-138630785 tempest-DeleteServersAdminTestJSON-138630785-project-member] Lock "659b62e3-0bb2-46fb-aa6a-faef4883bdc1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.298s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1892.416802] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Waiting for the task: (returnval){ [ 1892.416802] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5224a9d1-c758-c467-5d02-7bf935498f53" [ 1892.416802] env[59577]: _type = "Task" [ 1892.416802] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1892.424794] env[59577]: DEBUG nova.compute.manager [None req-5ce53f05-537c-4494-b7a4-5c6c39920506 tempest-ServerGroupTestJSON-834758818 tempest-ServerGroupTestJSON-834758818-project-member] [instance: bc569bae-c099-4437-8683-5af75ef5f106] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1892.446486] env[59577]: DEBUG nova.compute.manager [None req-5ce53f05-537c-4494-b7a4-5c6c39920506 tempest-ServerGroupTestJSON-834758818 tempest-ServerGroupTestJSON-834758818-project-member] [instance: bc569bae-c099-4437-8683-5af75ef5f106] Instance disappeared before build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1892.465581] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5ce53f05-537c-4494-b7a4-5c6c39920506 tempest-ServerGroupTestJSON-834758818 tempest-ServerGroupTestJSON-834758818-project-member] Lock "bc569bae-c099-4437-8683-5af75ef5f106" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 235.497s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1892.473819] env[59577]: DEBUG nova.compute.manager [None req-40841827-bd04-4c51-a0b7-417633b02639 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 86651832-02db-4181-8a9e-11da5f017f65] Starting instance... 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1892.486945] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1892.487281] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1892.487577] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Deleting the datastore file [datastore1] 1a375c37-fcec-4442-827d-103352e81035 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1892.487891] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4cd8f959-ab8a-4233-a7c6-4bda195adf0c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.495031] env[59577]: DEBUG oslo_vmware.api [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Waiting for the task: (returnval){ [ 1892.495031] env[59577]: value = "task-1933806" [ 1892.495031] env[59577]: _type = "Task" [ 1892.495031] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1892.506115] env[59577]: DEBUG oslo_vmware.api [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Task: {'id': task-1933806, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1892.507018] env[59577]: DEBUG nova.compute.manager [None req-40841827-bd04-4c51-a0b7-417633b02639 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 86651832-02db-4181-8a9e-11da5f017f65] Instance disappeared before build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1892.527379] env[59577]: DEBUG oslo_concurrency.lockutils [None req-40841827-bd04-4c51-a0b7-417633b02639 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Lock "86651832-02db-4181-8a9e-11da5f017f65" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.514s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1892.538725] env[59577]: DEBUG nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Starting instance... 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1892.584410] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1892.584690] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1892.586496] env[59577]: INFO nova.compute.claims [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1892.720843] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e314de1-a367-400d-a1cb-c1376463ceab {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.728171] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da2c2725-b1df-4cfa-a7d2-805a594590e8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.756828] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-583c6f7b-3d8e-4324-9cb0-1f5eec5f0036 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.763588] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a30482ca-61f0-4b26-9828-b96dfd0af8a6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.777011] env[59577]: DEBUG nova.compute.provider_tree [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1892.786226] env[59577]: DEBUG nova.scheduler.client.report [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1892.798494] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1892.798972] env[59577]: DEBUG nova.compute.manager [None 
req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1892.827564] env[59577]: DEBUG nova.compute.utils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1892.828892] env[59577]: DEBUG nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1892.829071] env[59577]: DEBUG nova.network.neutron [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1892.838172] env[59577]: DEBUG nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Start building block device mappings for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1892.885135] env[59577]: DEBUG nova.policy [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f0edf02bbc614e6eac34fec21f4610ef', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c9ad1cc1dd840f5b6a9ba5483856d93', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1892.898917] env[59577]: DEBUG nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1892.919293] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1892.919553] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1892.919728] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1892.919919] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] 
Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1892.920077] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1892.920233] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1892.920445] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1892.920603] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1892.920769] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1892.920946] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 
tempest-ServerDiskConfigTestJSON-1060147408-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1892.921169] env[59577]: DEBUG nova.virt.hardware [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1892.922045] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aea21998-1a0b-4c2f-a632-e7e9a9bd622d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.934745] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1892.934974] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Creating directory with path [datastore1] vmware_temp/7f095728-5814-43b1-a922-d294f5b8b544/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1892.935234] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-10dae286-a44e-441d-93a6-8aba9add74fc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.938046] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c5e2ef4b-2e7d-4ffb-9e9b-6d018553b3cf {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.952423] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Created directory with path [datastore1] vmware_temp/7f095728-5814-43b1-a922-d294f5b8b544/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1892.952614] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Fetch image to [datastore1] vmware_temp/7f095728-5814-43b1-a922-d294f5b8b544/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1892.952787] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/7f095728-5814-43b1-a922-d294f5b8b544/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1892.953526] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21ecca99-87b4-4957-b47d-3c12fb9cea20 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.959849] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e01f0042-87d9-41c8-929f-94d8f2a04164 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1892.969064] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deb7c8d0-4170-4713-8f50-1f170b2946f1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1893.002835] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52dd6c35-8aac-4622-83aa-511003f2fe84 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1893.009872] env[59577]: DEBUG oslo_vmware.api [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Task: {'id': task-1933806, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078644} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1893.011356] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1893.011542] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1893.011709] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 
1a375c37-fcec-4442-827d-103352e81035] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1893.011877] env[59577]: INFO nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1893.013963] env[59577]: DEBUG nova.compute.claims [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1893.015338] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1893.015338] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1893.017500] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7ddffa7a-cc77-48c4-a598-98149162c07a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1893.048762] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 
tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.034s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1893.049508] env[59577]: DEBUG nova.compute.utils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Instance 1a375c37-fcec-4442-827d-103352e81035 could not be found. {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1893.051255] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1893.051425] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1893.051590] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1893.051759] env[59577]: DEBUG nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1893.051916] env[59577]: DEBUG nova.network.neutron [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1893.108435] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1893.209690] env[59577]: DEBUG oslo_vmware.rw_handles [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7f095728-5814-43b1-a922-d294f5b8b544/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1893.267219] env[59577]: DEBUG neutronclient.v2_0.client [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59577) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1893.271565] env[59577]: ERROR nova.compute.manager [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] [instance: 1a375c37-fcec-4442-827d-103352e81035] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] Traceback (most recent call last): [ 1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] result = getattr(controller, method)(*args, **kwargs) [ 1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self._get(image_id) [ 
1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1893.271565] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] resp, body = self.http_client.get(url, headers=header) [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self.request(url, 'GET', **kwargs) [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self._handle_response(resp) [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise exc.from_response(resp, resp.content) [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. 
Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] During handling of the above exception, another exception occurred: [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] Traceback (most recent call last): [ 1893.271941] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self.driver.spawn(context, instance, image_meta, [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self._fetch_image_if_missing(context, vi) [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] image_fetch(context, vi, tmp_image_ds_loc) 
[ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] images.fetch_image( [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] metadata = IMAGE_API.get(context, image_ref) [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1893.272239] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return session.show(context, image_id, [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] _reraise_translated_image_exception(image_id) [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise new_exc.with_traceback(exc_trace) [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1893.272619] env[59577]: ERROR nova.compute.manager 
[instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] result = getattr(controller, method)(*args, **kwargs) [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self._get(image_id) [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1893.272619] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] resp, body = self.http_client.get(url, headers=header) [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self.request(url, 'GET', **kwargs) [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self._handle_response(resp) [ 1893.272944] env[59577]: 
ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise exc.from_response(resp, resp.content) [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] During handling of the above exception, another exception occurred: [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] Traceback (most recent call last): [ 1893.272944] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self._build_and_run_instance(context, instance, image, [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] with excutils.save_and_reraise_exception(): [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 
1a375c37-fcec-4442-827d-103352e81035] self.force_reraise() [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise self.value [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] with self.rt.instance_claim(context, instance, node, allocs, [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self.abort() [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1893.273272] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return f(*args, **kwargs) [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] 
self._unset_instance_host_and_node(instance) [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] instance.save() [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] updates, result = self.indirection_api.object_action( [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return cctxt.call(context, 'object_action', objinst=objinst, [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1893.273688] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] result = self.transport._send( [ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self._driver.send(target, ctxt, message, [ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1893.274380] env[59577]: ERROR 
nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1893.274380] env[59528]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise result
[ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] nova.exception_Remote.InstanceNotFound_Remote: Instance 1a375c37-fcec-4442-827d-103352e81035 could not be found.
[ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] Traceback (most recent call last):
[ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return getattr(target, method)(*args, **kwargs)
[ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.274380] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return fn(self, *args, **kwargs)
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return f(*args, **kwargs)
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] with excutils.save_and_reraise_exception() as ectxt:
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self.force_reraise()
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.274890] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise self.value
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return f(*args, **kwargs)
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return f(context, *args, **kwargs)
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise exception.InstanceNotFound(instance_id=uuid)
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.275357] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] nova.exception.InstanceNotFound: Instance 1a375c37-fcec-4442-827d-103352e81035 could not be found.
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] During handling of the above exception, another exception occurred:
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] Traceback (most recent call last):
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] ret = obj(*args, **kwargs)
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] exception_handler_v20(status_code, error_body)
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise client_exc(message=error_message,
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] Neutron server returns request_ids: ['req-98737a64-aefa-41cf-9f14-e3caad42ab5e']
[ 1893.275741] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] During handling of the above exception, another exception occurred:
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035]
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] Traceback (most recent call last):
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self._deallocate_network(context, instance, requested_networks)
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self.network_api.deallocate_for_instance(
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] data = neutron.list_ports(**search_opts)
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] ret = obj(*args, **kwargs)
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1893.276123] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self.list('ports', self.ports_path, retrieve_all,
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] ret = obj(*args, **kwargs)
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] for r in self._pagination(collection, path, **params):
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] res = self.get(path, params=params)
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] ret = obj(*args, **kwargs)
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self.retry_request("GET", action, body=body,
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] ret = obj(*args, **kwargs)
[ 1893.276461] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1893.276770] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] return self.do_request(method, action, body=body,
[ 1893.276770] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1893.276770] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] ret = obj(*args, **kwargs)
[ 1893.276770] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1893.276770] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] self._handle_fault_response(status_code, replybody, resp)
[ 1893.276770] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1893.276770] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] raise exception.Unauthorized()
[ 1893.276770] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] nova.exception.Unauthorized: Not authorized.
[ 1893.276770] env[59577]: ERROR nova.compute.manager [instance: 1a375c37-fcec-4442-827d-103352e81035] [ 1893.276770] env[59577]: DEBUG nova.network.neutron [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Successfully created port: b39a43ff-1ec7-49c5-9e5b-17b008629544 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1893.280878] env[59577]: DEBUG oslo_vmware.rw_handles [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Completed reading data from the image iterator. {{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1893.280878] env[59577]: DEBUG oslo_vmware.rw_handles [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7f095728-5814-43b1-a922-d294f5b8b544/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1893.312926] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3caa5cac-bb82-4077-85ff-9a473df22fa5 tempest-SecurityGroupsTestJSON-259355471 tempest-SecurityGroupsTestJSON-259355471-project-member] Lock "1a375c37-fcec-4442-827d-103352e81035" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 267.303s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1893.325067] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1893.388843] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1893.389121] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1893.390737] env[59577]: INFO nova.compute.claims [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] 
Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1893.535556] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ab95845-ffd6-402e-9042-d9f597a0c6b1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1893.541581] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a18c5d63-8848-4806-8e77-6604092e00a3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1893.571495] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5e38a66-0fe2-43b8-9900-60f0320a7f92 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1893.579768] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05426d87-e609-4297-b59c-e471e9c8a82c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1893.594642] env[59577]: DEBUG nova.compute.provider_tree [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1893.602985] env[59577]: DEBUG nova.scheduler.client.report [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1893.616176] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1893.616681] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1893.651228] env[59577]: DEBUG nova.compute.utils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1893.652757] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Allocating IP information in the background. 
{{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1893.652757] env[59577]: DEBUG nova.network.neutron [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1893.661520] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1893.723606] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1893.739655] env[59577]: DEBUG nova.policy [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '45e7d805f0624c0e953bf3dd1e243332', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5cfb462e54c147038f2fa700ce7b4a41', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1893.743840] env[59577]: DEBUG nova.virt.hardware [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1893.744084] env[59577]: DEBUG nova.virt.hardware [None 
req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1893.744245] env[59577]: DEBUG nova.virt.hardware [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1893.744424] env[59577]: DEBUG nova.virt.hardware [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1893.744569] env[59577]: DEBUG nova.virt.hardware [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1893.744792] env[59577]: DEBUG nova.virt.hardware [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1893.744999] env[59577]: DEBUG nova.virt.hardware [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1893.745169] env[59577]: DEBUG 
nova.virt.hardware [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1893.745330] env[59577]: DEBUG nova.virt.hardware [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1893.745503] env[59577]: DEBUG nova.virt.hardware [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1893.745675] env[59577]: DEBUG nova.virt.hardware [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1893.746518] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e436ac58-d01c-468b-a7bc-139e905f8431 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1893.754206] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22f12716-4aca-49e2-a40a-cf748c3c3962 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1894.055495] env[59577]: DEBUG nova.compute.manager [req-8c32e894-86d4-4223-a8ba-707f67e592d5 req-aa741294-7753-432a-88ac-3d944b13cdfe service nova] 
[instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Received event network-vif-plugged-b39a43ff-1ec7-49c5-9e5b-17b008629544 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1894.055751] env[59577]: DEBUG oslo_concurrency.lockutils [req-8c32e894-86d4-4223-a8ba-707f67e592d5 req-aa741294-7753-432a-88ac-3d944b13cdfe service nova] Acquiring lock "aac8eec6-577b-46d2-9baa-8cf548a6970e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1894.055969] env[59577]: DEBUG oslo_concurrency.lockutils [req-8c32e894-86d4-4223-a8ba-707f67e592d5 req-aa741294-7753-432a-88ac-3d944b13cdfe service nova] Lock "aac8eec6-577b-46d2-9baa-8cf548a6970e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1894.056149] env[59577]: DEBUG oslo_concurrency.lockutils [req-8c32e894-86d4-4223-a8ba-707f67e592d5 req-aa741294-7753-432a-88ac-3d944b13cdfe service nova] Lock "aac8eec6-577b-46d2-9baa-8cf548a6970e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1894.056315] env[59577]: DEBUG nova.compute.manager [req-8c32e894-86d4-4223-a8ba-707f67e592d5 req-aa741294-7753-432a-88ac-3d944b13cdfe service nova] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] No waiting events found dispatching network-vif-plugged-b39a43ff-1ec7-49c5-9e5b-17b008629544 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1894.056476] env[59577]: WARNING nova.compute.manager [req-8c32e894-86d4-4223-a8ba-707f67e592d5 req-aa741294-7753-432a-88ac-3d944b13cdfe service nova] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Received unexpected event 
network-vif-plugged-b39a43ff-1ec7-49c5-9e5b-17b008629544 for instance with vm_state building and task_state spawning. [ 1894.220509] env[59577]: DEBUG nova.network.neutron [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Successfully updated port: b39a43ff-1ec7-49c5-9e5b-17b008629544 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1894.230430] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "refresh_cache-aac8eec6-577b-46d2-9baa-8cf548a6970e" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1894.230582] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquired lock "refresh_cache-aac8eec6-577b-46d2-9baa-8cf548a6970e" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1894.230735] env[59577]: DEBUG nova.network.neutron [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1894.289549] env[59577]: DEBUG nova.network.neutron [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Instance cache missing network info. 
{{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1894.499552] env[59577]: DEBUG nova.network.neutron [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Successfully created port: ddd4e384-c685-4d4d-b892-53e96a0bb7b0 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1894.573341] env[59577]: DEBUG nova.network.neutron [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Updating instance_info_cache with network_info: [{"id": "b39a43ff-1ec7-49c5-9e5b-17b008629544", "address": "fa:16:3e:2c:88:66", "network": {"id": "ed1e671e-cb6d-4b30-b469-e2b0e91786f8", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-483626843-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c9ad1cc1dd840f5b6a9ba5483856d93", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb39a43ff-1e", "ovs_interfaceid": "b39a43ff-1ec7-49c5-9e5b-17b008629544", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1894.585016] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Releasing lock "refresh_cache-aac8eec6-577b-46d2-9baa-8cf548a6970e" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1894.585497] env[59577]: DEBUG nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Instance network_info: |[{"id": "b39a43ff-1ec7-49c5-9e5b-17b008629544", "address": "fa:16:3e:2c:88:66", "network": {"id": "ed1e671e-cb6d-4b30-b469-e2b0e91786f8", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-483626843-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c9ad1cc1dd840f5b6a9ba5483856d93", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb39a43ff-1e", "ovs_interfaceid": "b39a43ff-1ec7-49c5-9e5b-17b008629544", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}}
[ 1894.586341] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2c:88:66', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3f4a795c-8718-4a7c-aafe-9da231df10f8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b39a43ff-1ec7-49c5-9e5b-17b008629544', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1894.599955] env[59577]: DEBUG oslo.service.loopingcall [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1894.600708] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1894.601069] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-931bc819-d151-4600-a60a-ddc70d41872b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1894.633600] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1894.633600] env[59577]: value = "task-1933807"
[ 1894.633600] env[59577]: _type = "Task"
[ 1894.633600] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1894.646167] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933807, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1895.144671] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933807, 'name': CreateVM_Task, 'duration_secs': 0.327077} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1895.144866] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1895.145609] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1895.145819] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1895.146129] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 1895.146373] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-00af6cca-2782-4f39-9296-aaf5f05f0fa0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1895.151316] env[59577]: DEBUG oslo_vmware.api [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Waiting for the task: (returnval){
[ 1895.151316] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52defa28-321d-37f8-0555-842c7d0909ba"
[ 1895.151316] env[59577]: _type = "Task"
[ 1895.151316] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1895.161101] env[59577]: DEBUG oslo_vmware.api [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52defa28-321d-37f8-0555-842c7d0909ba, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1895.215439] env[59577]: DEBUG nova.network.neutron [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Successfully updated port: ddd4e384-c685-4d4d-b892-53e96a0bb7b0 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1895.224284] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Acquiring lock "refresh_cache-5af13348-9f89-44b2-93bd-f9fb91598c73" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1895.224662] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member]
Acquired lock "refresh_cache-5af13348-9f89-44b2-93bd-f9fb91598c73" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1895.224662] env[59577]: DEBUG nova.network.neutron [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1895.287689] env[59577]: DEBUG nova.network.neutron [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1895.505014] env[59577]: DEBUG nova.network.neutron [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Updating instance_info_cache with network_info: [{"id": "ddd4e384-c685-4d4d-b892-53e96a0bb7b0", "address": "fa:16:3e:60:93:7b", "network": {"id": "72553244-f04d-40c1-b0fc-e29645fa419f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-950188364-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5cfb462e54c147038f2fa700ce7b4a41", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapddd4e384-c6", "ovs_interfaceid": "ddd4e384-c685-4d4d-b892-53e96a0bb7b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1895.517293] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Releasing lock "refresh_cache-5af13348-9f89-44b2-93bd-f9fb91598c73" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1895.517782] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Instance network_info: |[{"id": "ddd4e384-c685-4d4d-b892-53e96a0bb7b0", "address": "fa:16:3e:60:93:7b", "network": {"id": "72553244-f04d-40c1-b0fc-e29645fa419f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-950188364-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5cfb462e54c147038f2fa700ce7b4a41", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": 
"nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapddd4e384-c6", "ovs_interfaceid": "ddd4e384-c685-4d4d-b892-53e96a0bb7b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1895.518776] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:60:93:7b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5f60c972-a72d-4c5f-a250-faadfd6eafbe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ddd4e384-c685-4d4d-b892-53e96a0bb7b0', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1895.526044] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Creating folder: Project (5cfb462e54c147038f2fa700ce7b4a41). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1895.526589] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-46919722-d30b-47bf-966e-c66fb818ff31 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1895.537666] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Created folder: Project (5cfb462e54c147038f2fa700ce7b4a41) in parent group-v398749. 
[ 1895.537837] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Creating folder: Instances. Parent ref: group-v398795. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1895.538032] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fa09bec8-d5f7-49d9-b542-2fa60869cef9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1895.547343] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Created folder: Instances in parent group-v398795. [ 1895.547562] env[59577]: DEBUG oslo.service.loopingcall [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1895.547758] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1895.547952] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b1b27135-aaa9-499e-8a02-5ab681489eaa {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1895.566041] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1895.566041] env[59577]: value = "task-1933810" [ 1895.566041] env[59577]: _type = "Task" [ 1895.566041] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1895.573286] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933810, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1895.662273] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1895.662530] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1895.662770] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1896.075679] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933810, 'name': CreateVM_Task, 'duration_secs': 0.311944} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1896.075913] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1896.076591] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1896.076757] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1896.077084] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1896.077322] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-90a0c02e-48ef-4aad-83ad-be46c47ffa89 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1896.081649] env[59577]: DEBUG oslo_vmware.api [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Waiting for the task: 
(returnval){ [ 1896.081649] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5283781a-5fba-23fe-75d1-29385f1f795d" [ 1896.081649] env[59577]: _type = "Task" [ 1896.081649] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1896.083966] env[59577]: DEBUG nova.compute.manager [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Received event network-changed-b39a43ff-1ec7-49c5-9e5b-17b008629544 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1896.084180] env[59577]: DEBUG nova.compute.manager [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Refreshing instance network info cache due to event network-changed-b39a43ff-1ec7-49c5-9e5b-17b008629544. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1896.084346] env[59577]: DEBUG oslo_concurrency.lockutils [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] Acquiring lock "refresh_cache-aac8eec6-577b-46d2-9baa-8cf548a6970e" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1896.084488] env[59577]: DEBUG oslo_concurrency.lockutils [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] Acquired lock "refresh_cache-aac8eec6-577b-46d2-9baa-8cf548a6970e" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1896.084642] env[59577]: DEBUG nova.network.neutron [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Refreshing network info cache for port b39a43ff-1ec7-49c5-9e5b-17b008629544 {{(pid=59577) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1896.093337] env[59577]: DEBUG oslo_vmware.api [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5283781a-5fba-23fe-75d1-29385f1f795d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1896.318463] env[59577]: DEBUG nova.network.neutron [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Updated VIF entry in instance network info cache for port b39a43ff-1ec7-49c5-9e5b-17b008629544. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1896.318819] env[59577]: DEBUG nova.network.neutron [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Updating instance_info_cache with network_info: [{"id": "b39a43ff-1ec7-49c5-9e5b-17b008629544", "address": "fa:16:3e:2c:88:66", "network": {"id": "ed1e671e-cb6d-4b30-b469-e2b0e91786f8", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-483626843-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c9ad1cc1dd840f5b6a9ba5483856d93", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": 
"nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb39a43ff-1e", "ovs_interfaceid": "b39a43ff-1ec7-49c5-9e5b-17b008629544", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1896.327748] env[59577]: DEBUG oslo_concurrency.lockutils [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] Releasing lock "refresh_cache-aac8eec6-577b-46d2-9baa-8cf548a6970e" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1896.327985] env[59577]: DEBUG nova.compute.manager [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Received event network-vif-plugged-ddd4e384-c685-4d4d-b892-53e96a0bb7b0 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1896.328193] env[59577]: DEBUG oslo_concurrency.lockutils [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] Acquiring lock "5af13348-9f89-44b2-93bd-f9fb91598c73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1896.328387] env[59577]: DEBUG oslo_concurrency.lockutils [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] Lock "5af13348-9f89-44b2-93bd-f9fb91598c73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1896.328544] env[59577]: DEBUG oslo_concurrency.lockutils [req-5b18c747-4657-421a-a35c-291f0031ba9d 
req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] Lock "5af13348-9f89-44b2-93bd-f9fb91598c73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1896.328704] env[59577]: DEBUG nova.compute.manager [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] No waiting events found dispatching network-vif-plugged-ddd4e384-c685-4d4d-b892-53e96a0bb7b0 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1896.328867] env[59577]: WARNING nova.compute.manager [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Received unexpected event network-vif-plugged-ddd4e384-c685-4d4d-b892-53e96a0bb7b0 for instance with vm_state building and task_state spawning. [ 1896.329069] env[59577]: DEBUG nova.compute.manager [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Received event network-changed-ddd4e384-c685-4d4d-b892-53e96a0bb7b0 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1896.329248] env[59577]: DEBUG nova.compute.manager [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Refreshing instance network info cache due to event network-changed-ddd4e384-c685-4d4d-b892-53e96a0bb7b0. 
{{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1896.329432] env[59577]: DEBUG oslo_concurrency.lockutils [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] Acquiring lock "refresh_cache-5af13348-9f89-44b2-93bd-f9fb91598c73" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1896.329568] env[59577]: DEBUG oslo_concurrency.lockutils [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] Acquired lock "refresh_cache-5af13348-9f89-44b2-93bd-f9fb91598c73" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1896.329719] env[59577]: DEBUG nova.network.neutron [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Refreshing network info cache for port ddd4e384-c685-4d4d-b892-53e96a0bb7b0 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1896.593140] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1896.593410] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1896.593656] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 
tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1896.608220] env[59577]: DEBUG nova.network.neutron [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Updated VIF entry in instance network info cache for port ddd4e384-c685-4d4d-b892-53e96a0bb7b0. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1896.608642] env[59577]: DEBUG nova.network.neutron [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Updating instance_info_cache with network_info: [{"id": "ddd4e384-c685-4d4d-b892-53e96a0bb7b0", "address": "fa:16:3e:60:93:7b", "network": {"id": "72553244-f04d-40c1-b0fc-e29645fa419f", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-950188364-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5cfb462e54c147038f2fa700ce7b4a41", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapddd4e384-c6", "ovs_interfaceid": "ddd4e384-c685-4d4d-b892-53e96a0bb7b0", "qbh_params": 
null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1896.617791] env[59577]: DEBUG oslo_concurrency.lockutils [req-5b18c747-4657-421a-a35c-291f0031ba9d req-36817130-98e8-4606-954e-2e5c66cfe313 service nova] Releasing lock "refresh_cache-5af13348-9f89-44b2-93bd-f9fb91598c73" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1907.077615] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1908.045171] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1911.044728] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1915.039740] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1915.044449] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} 
[ 1915.044449] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1916.045527] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1916.055276] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1916.055485] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1916.055656] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1916.055804] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1916.056881] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-cc6bacd7-6ca9-47c6-99fe-ae83e244589f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1916.065404] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e144243-6454-4eeb-af6b-4ac06575b65c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1916.078804] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52d2ba08-9a71-4dbe-833e-2f5e2a68eaa4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1916.084741] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-946d7e89-7e60-4df2-8d99-3ee19938cccd {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1916.112815] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181304MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1916.112909] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1916.113113] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1916.217647] 
env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 8ed18ae2-2ba1-424c-b695-846afd7b3501 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1916.217813] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance aac8eec6-577b-46d2-9baa-8cf548a6970e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1916.217944] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 5af13348-9f89-44b2-93bd-f9fb91598c73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1916.228616] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 56259797-6883-437c-8942-5beca0e1ef7b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1916.239164] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 22339279-c381-4ccb-bba0-b0b554203e60 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1916.247803] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1916.256237] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance e2c0fb3c-6cee-4be4-a368-0fa86a07ff88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1916.264658] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1916.264853] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1916.264997] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1916.280084] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing inventories for resource provider cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1916.294031] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Updating ProviderTree inventory for provider cbad7164-1dca-4b60-b95b-712603801988 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1916.294031] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Updating inventory in ProviderTree 
for provider cbad7164-1dca-4b60-b95b-712603801988 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1916.303166] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing aggregate associations for resource provider cbad7164-1dca-4b60-b95b-712603801988, aggregates: None {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1916.317411] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Refreshing trait associations for resource provider cbad7164-1dca-4b60-b95b-712603801988, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=59577) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1916.413047] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa4881fc-10b8-4b7e-9a94-ef6b9d1734eb {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1916.422018] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a27a7b8b-07d4-4a96-b077-dbc0a152818f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1916.450971] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6887847f-e81b-4a5a-97cd-ecd4fc62249d {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1916.459150] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9294d8a7-72ff-4207-8b7d-b5662b1767ff {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1916.471823] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1916.484909] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1916.498105] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1916.498301] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.385s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1917.498373] env[59577]: DEBUG oslo_service.periodic_task 
[None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1917.498627] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1920.045473] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1920.045858] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1920.045858] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1920.058148] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1920.058316] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Skipping network cache update for instance because it is Building. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1920.058453] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1920.058583] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1939.952443] env[59577]: WARNING oslo_vmware.rw_handles [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1939.952443] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1939.952443] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1939.952443] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1939.952443] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1939.952443] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 1939.952443] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1939.952443] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1939.952443] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1939.952443] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1939.952443] env[59577]: 
ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1939.952443] env[59577]: ERROR oslo_vmware.rw_handles [ 1939.953110] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/7f095728-5814-43b1-a922-d294f5b8b544/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1939.954581] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1939.954828] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Copying Virtual Disk [datastore1] vmware_temp/7f095728-5814-43b1-a922-d294f5b8b544/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] vmware_temp/7f095728-5814-43b1-a922-d294f5b8b544/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1939.955110] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-04517aba-c091-40a9-8438-c18bf52bd6e3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1939.963587] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f 
tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Waiting for the task: (returnval){ [ 1939.963587] env[59577]: value = "task-1933811" [ 1939.963587] env[59577]: _type = "Task" [ 1939.963587] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1939.971860] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Task: {'id': task-1933811, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1940.475452] env[59577]: DEBUG oslo_vmware.exceptions [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Fault InvalidArgument not matched. {{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1940.475673] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1940.476307] env[59577]: ERROR nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1940.476307] env[59577]: Faults: ['InvalidArgument'] [ 1940.476307] env[59577]: ERROR nova.compute.manager [instance: 
47aac36c-3f70-40a8-ab60-cebba86d3f85] Traceback (most recent call last): [ 1940.476307] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1940.476307] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] yield resources [ 1940.476307] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1940.476307] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self.driver.spawn(context, instance, image_meta, [ 1940.476307] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1940.476307] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1940.476307] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1940.476307] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self._fetch_image_if_missing(context, vi) [ 1940.476307] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] image_cache(vi, tmp_image_ds_loc) [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 
47aac36c-3f70-40a8-ab60-cebba86d3f85] vm_util.copy_virtual_disk( [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] session._wait_for_task(vmdk_copy_task) [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return self.wait_for_task(task_ref) [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return evt.wait() [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] result = hub.switch() [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1940.476668] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return self.greenlet.switch() [ 1940.477047] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1940.477047] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] 
self.f(*self.args, **self.kw) [ 1940.477047] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1940.477047] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] raise exceptions.translate_fault(task_info.error) [ 1940.477047] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1940.477047] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Faults: ['InvalidArgument'] [ 1940.477047] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1940.477047] env[59577]: INFO nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Terminating instance [ 1940.478161] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1940.478378] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1940.478626] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c6b02379-89a2-4666-81ee-e9aff32192ba 
{{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1940.481536] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1940.481662] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1940.482370] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e64d3368-42d7-45b9-b219-4be265547570 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1940.489206] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1940.489400] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cca4461a-0c22-4466-82a6-2624194a7787 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1940.491406] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1940.491579] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1940.492492] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e79e88ac-2e8b-400a-a30e-90fb9fa5bff7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1940.497132] env[59577]: DEBUG oslo_vmware.api [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Waiting for the task: (returnval){ [ 1940.497132] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]525c35e2-fb3d-f664-1ba3-7b2b0d978865" [ 1940.497132] env[59577]: _type = "Task" [ 1940.497132] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1940.504336] env[59577]: DEBUG oslo_vmware.api [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]525c35e2-fb3d-f664-1ba3-7b2b0d978865, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1940.568512] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1940.568738] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1940.568918] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Deleting the datastore file [datastore1] 47aac36c-3f70-40a8-ab60-cebba86d3f85 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1940.569189] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-839ea06d-3d1a-40ce-8ee5-058c86b8f82e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1940.574846] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Waiting for the task: (returnval){ [ 1940.574846] env[59577]: value = "task-1933813" [ 1940.574846] env[59577]: _type = "Task" [ 1940.574846] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1940.582555] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Task: {'id': task-1933813, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1941.007064] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1941.008026] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Creating directory with path [datastore1] vmware_temp/b65494ff-19d4-4df4-a1e5-95300e4cd70d/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1941.008026] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-183f115a-6f67-45c6-860e-0dcb53c97309 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1941.036053] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Created directory with path [datastore1] vmware_temp/b65494ff-19d4-4df4-a1e5-95300e4cd70d/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1941.036149] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 
tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Fetch image to [datastore1] vmware_temp/b65494ff-19d4-4df4-a1e5-95300e4cd70d/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1941.036303] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/b65494ff-19d4-4df4-a1e5-95300e4cd70d/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1941.037097] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e09ded60-e51a-48d1-8345-ba1e902f2302 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1941.043918] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-def94345-a05e-4ff1-a585-ddd722860e5b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1941.052848] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d738e69f-4758-401f-b3b2-e9a26ca31284 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1941.084998] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc0829e0-2a95-477a-8ede-399d9f6b91c6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1941.092058] env[59577]: DEBUG 
oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Task: {'id': task-1933813, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07606} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1941.093458] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1941.093646] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1941.093819] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1941.093987] env[59577]: INFO nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1941.095694] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4fadf61b-f1de-4f11-ba9b-54144d4b77dc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1941.097496] env[59577]: DEBUG nova.compute.claims [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1941.097665] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1941.097877] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1941.119526] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1941.130318] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 
tempest-MultipleCreateTestJSON-91958749-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.032s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1941.130892] env[59577]: DEBUG nova.compute.utils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Instance 47aac36c-3f70-40a8-ab60-cebba86d3f85 could not be found. {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1941.132356] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1941.132590] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1941.132769] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1941.132988] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1941.133180] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1941.157930] env[59577]: DEBUG neutronclient.v2_0.client [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59577) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1941.159297] env[59577]: ERROR nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Traceback (most recent call last): [ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self.driver.spawn(context, instance, image_meta, [ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self._fetch_image_if_missing(context, vi) [ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] image_cache(vi, tmp_image_ds_loc) [ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1941.159297] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] vm_util.copy_virtual_disk( [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1941.159764] 
env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] session._wait_for_task(vmdk_copy_task) [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return self.wait_for_task(task_ref) [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return evt.wait() [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] result = hub.switch() [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return self.greenlet.switch() [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self.f(*self.args, **self.kw) [ 1941.159764] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1941.160063] env[59577]: ERROR nova.compute.manager 
[instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] raise exceptions.translate_fault(task_info.error) [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Faults: ['InvalidArgument'] [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] During handling of the above exception, another exception occurred: [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Traceback (most recent call last): [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self._build_and_run_instance(context, instance, image, [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] with excutils.save_and_reraise_exception(): [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1941.160063] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self.force_reraise() [ 1941.160063] env[59577]: ERROR nova.compute.manager 
[instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] raise self.value [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] with self.rt.instance_claim(context, instance, node, allocs, [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self.abort() [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return f(*args, **kwargs) [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self._unset_instance_host_and_node(instance) [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 
47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1941.160386] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] instance.save() [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] updates, result = self.indirection_api.object_action( [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return cctxt.call(context, 'object_action', objinst=objinst, [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] result = self.transport._send( [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return self._driver.send(target, ctxt, message, [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return self._send(target, ctxt, message, 
wait_for_reply, timeout, [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1941.160765] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] raise result [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] nova.exception_Remote.InstanceNotFound_Remote: Instance 47aac36c-3f70-40a8-ab60-cebba86d3f85 could not be found. [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Traceback (most recent call last): [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return getattr(target, method)(*args, **kwargs) [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return fn(self, *args, **kwargs) [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] old_ref, inst_ref = 
db.instance_update_and_get_original( [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return f(*args, **kwargs) [ 1941.161243] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] with excutils.save_and_reraise_exception() as ectxt: [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self.force_reraise() [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] raise self.value [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 
142, in wrapper [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return f(*args, **kwargs) [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return f(context, *args, **kwargs) [ 1941.161783] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] raise exception.InstanceNotFound(instance_id=uuid) [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] nova.exception.InstanceNotFound: Instance 47aac36c-3f70-40a8-ab60-cebba86d3f85 could not be found. 
[ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] During handling of the above exception, another exception occurred: [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Traceback (most recent call last): [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] ret = obj(*args, **kwargs) [ 1941.162364] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] exception_handler_v20(status_code, error_body) [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] raise client_exc(message=error_message, [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 
47aac36c-3f70-40a8-ab60-cebba86d3f85] Neutron server returns request_ids: ['req-36094d98-c9ca-4818-8f0b-59d18f83a723'] [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] During handling of the above exception, another exception occurred: [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] Traceback (most recent call last): [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self._deallocate_network(context, instance, requested_networks) [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1941.162964] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self.network_api.deallocate_for_instance( [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] data = neutron.list_ports(**search_opts) [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] ret = obj(*args, **kwargs) [ 1941.163309] env[59577]: ERROR 
nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return self.list('ports', self.ports_path, retrieve_all, [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] ret = obj(*args, **kwargs) [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] for r in self._pagination(collection, path, **params): [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] res = self.get(path, params=params) [ 1941.163309] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] ret = obj(*args, **kwargs) [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return self.retry_request("GET", action, body=body, [ 
1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] ret = obj(*args, **kwargs) [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] return self.do_request(method, action, body=body, [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] ret = obj(*args, **kwargs) [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] self._handle_fault_response(status_code, replybody, resp) [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1941.163601] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] raise exception.Unauthorized() [ 1941.163950] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] nova.exception.Unauthorized: Not authorized. 
[ 1941.163950] env[59577]: ERROR nova.compute.manager [instance: 47aac36c-3f70-40a8-ab60-cebba86d3f85] [ 1941.180178] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "47aac36c-3f70-40a8-ab60-cebba86d3f85" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 311.926s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1941.189313] env[59577]: DEBUG nova.compute.manager [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1941.214618] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1941.215373] env[59577]: ERROR nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. 
[ 1941.215373] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Traceback (most recent call last):
[ 1941.215373] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1941.215373] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1941.215373] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1941.215373] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] result = getattr(controller, method)(*args, **kwargs)
[ 1941.215373] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1941.215373] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] return self._get(image_id)
[ 1941.215373] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1941.215373] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1941.215373] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] resp, body = self.http_client.get(url, headers=header)
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] return self.request(url, 'GET', **kwargs)
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] return self._handle_response(resp)
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] raise exc.from_response(resp, resp.content)
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] During handling of the above exception, another exception occurred:
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.215744] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Traceback (most recent call last):
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] yield resources
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] self.driver.spawn(context, instance, image_meta,
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] self._fetch_image_if_missing(context, vi)
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] image_fetch(context, vi, tmp_image_ds_loc)
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] images.fetch_image(
[ 1941.216024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] metadata = IMAGE_API.get(context, image_ref)
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] return session.show(context, image_id,
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] _reraise_translated_image_exception(image_id)
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] raise new_exc.with_traceback(exc_trace)
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] result = getattr(controller, method)(*args, **kwargs)
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1941.216392] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] return self._get(image_id)
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] resp, body = self.http_client.get(url, headers=header)
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] return self.request(url, 'GET', **kwargs)
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] return self._handle_response(resp)
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] raise exc.from_response(resp, resp.content)
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c.
[ 1941.216696] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.217312] env[59577]: INFO nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Terminating instance
[ 1941.217312] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1941.217312] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1941.217549] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d989c914-6e90-4da8-9fe8-436553d0e4c2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.221370] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1941.221557] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1941.226064] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6623dff6-1574-4208-9b35-eec9a85a24c3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.230129] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1941.230304] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1941.231335] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0d183394-cf8f-4d11-8a7d-bf8f512fd0f9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.235650] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1941.236160] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-62998b10-48e2-4b12-8c32-e339b9134797 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.238628] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Waiting for the task: (returnval){
[ 1941.238628] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]529acf7b-e3b3-9074-015f-6e8e23eabaa5"
[ 1941.238628] env[59577]: _type = "Task"
[ 1941.238628] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1941.239410] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1941.239651] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1941.241019] env[59577]: INFO nova.compute.claims [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1941.251090] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]529acf7b-e3b3-9074-015f-6e8e23eabaa5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1941.304354] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1941.304595] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1941.304756] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Deleting the datastore file [datastore1] e9b9f5db-afac-494e-9850-c0d82f26fc68 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1941.305020] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-acc4e537-c3c5-43e5-aebc-9abc5a549ffc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.311365] env[59577]: DEBUG oslo_vmware.api [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Waiting for the task: (returnval){
[ 1941.311365] env[59577]: value = "task-1933815"
[ 1941.311365] env[59577]: _type = "Task"
[ 1941.311365] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1941.319044] env[59577]: DEBUG oslo_vmware.api [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Task: {'id': task-1933815, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1941.377532] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5d9e1cc-c83d-4b53-9e26-0da4e3f6b511 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.384571] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c68ba1b2-cef4-40ac-94fa-67abe8b585b2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.413818] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8707099-555f-4eb1-a6d9-d1f1c4a8af48 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.420587] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28043475-b679-496c-9276-1b09d536ffc8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.434596] env[59577]: DEBUG nova.compute.provider_tree [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1941.443448] env[59577]: DEBUG nova.scheduler.client.report [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1941.455762] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1941.456221] env[59577]: DEBUG nova.compute.manager [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1941.484830] env[59577]: DEBUG nova.compute.utils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1941.486262] env[59577]: DEBUG nova.compute.manager [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1941.486434] env[59577]: DEBUG nova.network.neutron [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1941.495481] env[59577]: DEBUG nova.compute.manager [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1941.543438] env[59577]: DEBUG nova.policy [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17f8660c9c09492bba48e650519012cf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a8bdbc34699435bbde5622db4df613f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1941.563749] env[59577]: DEBUG nova.compute.manager [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1941.584693] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1941.584925] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1941.585093] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1941.585277] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1941.585420] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1941.585566] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1941.585770] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1941.585925] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1941.586100] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1941.586261] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1941.586437] env[59577]: DEBUG nova.virt.hardware [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1941.587292] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48ad30bf-ffcf-41e0-9ab2-253dabcb589f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.595245] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-378f06ea-5e3a-441d-99db-632f6030ef84 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.754551] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1941.754808] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Creating directory with path [datastore1] vmware_temp/56cb5b38-554e-46b6-88f4-31e32e1dbf43/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1941.755045] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b89d6a6c-76bd-4d5d-a289-9fb5f18e9be8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.766224] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Created directory with path [datastore1] vmware_temp/56cb5b38-554e-46b6-88f4-31e32e1dbf43/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1941.766417] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Fetch image to [datastore1] vmware_temp/56cb5b38-554e-46b6-88f4-31e32e1dbf43/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1941.766591] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/56cb5b38-554e-46b6-88f4-31e32e1dbf43/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1941.767471] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-478d9342-872d-4191-a477-fe5555b2b4dc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.774323] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-902f711c-4dad-4eda-adef-263f1fedc799 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.783410] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eae3eff1-1828-4815-8086-5a79b9c8934c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.817936] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3d8a2d3-337f-4c9d-81c8-c9df9fbf9818 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.826142] env[59577]: DEBUG oslo_vmware.api [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Task: {'id': task-1933815, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073747} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1941.826810] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1941.826994] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1941.827177] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1941.827342] env[59577]: INFO nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 1941.829203] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f98a70a0-abe5-4abc-82de-3e8b0c178ca9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1941.832341] env[59577]: DEBUG nova.compute.claims [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1941.832341] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1941.832341] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1941.853820] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1941.874657] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.042s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1941.874828] env[59577]: DEBUG nova.compute.utils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Instance e9b9f5db-afac-494e-9850-c0d82f26fc68 could not be found. {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1941.877206] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1941.877206] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1941.877206] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1941.877206] env[59577]: DEBUG nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1941.877206] env[59577]: DEBUG nova.network.neutron [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1941.884838] env[59577]: DEBUG nova.network.neutron [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Successfully created port: 63f4eeb6-1ad0-4da2-a8ac-8ff233044606 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1941.912927] env[59577]: DEBUG neutronclient.v2_0.client [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59577) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1941.914298] env[59577]: ERROR nova.compute.manager [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Traceback (most recent call last):
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     result = getattr(controller, method)(*args, **kwargs)
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self._get(image_id)
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1941.914298] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     resp, body = self.http_client.get(url, headers=header)
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self.request(url, 'GET', **kwargs)
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self._handle_response(resp)
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     raise exc.from_response(resp, resp.content)
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] During handling of the above exception, another exception occurred:
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Traceback (most recent call last):
[ 1941.914645] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self.driver.spawn(context, instance, image_meta,
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self._fetch_image_if_missing(context, vi)
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     images.fetch_image(
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     metadata = IMAGE_API.get(context, image_ref)
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1941.915024] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return session.show(context, image_id,
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     _reraise_translated_image_exception(image_id)
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     raise new_exc.with_traceback(exc_trace)
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     result = getattr(controller, method)(*args, **kwargs)
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self._get(image_id)
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1941.915310] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     resp, body = self.http_client.get(url, headers=header)
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self.request(url, 'GET', **kwargs)
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self._handle_response(resp)
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     raise exc.from_response(resp, resp.content)
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c.
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] During handling of the above exception, another exception occurred:
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Traceback (most recent call last):
[ 1941.915580] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self._build_and_run_instance(context, instance, image,
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     with excutils.save_and_reraise_exception():
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self.force_reraise()
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     raise self.value
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     with self.rt.instance_claim(context, instance, node, allocs,
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self.abort()
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1941.915854] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return f(*args, **kwargs)
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self._unset_instance_host_and_node(instance)
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     instance.save()
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     updates, result = self.indirection_api.object_action(
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return cctxt.call(context, 'object_action', objinst=objinst,
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1941.916157] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     result = self.transport._send(
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self._driver.send(target, ctxt, message,
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     raise result
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] nova.exception_Remote.InstanceNotFound_Remote: Instance e9b9f5db-afac-494e-9850-c0d82f26fc68 could not be found.
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Traceback (most recent call last):
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return getattr(target, method)(*args, **kwargs)
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.916491] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return fn(self, *args, **kwargs)
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     old_ref, inst_ref = db.instance_update_and_get_original(
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return f(*args, **kwargs)
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     with excutils.save_and_reraise_exception() as ectxt:
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self.force_reraise()
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.916796] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     raise self.value
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return f(*args, **kwargs)
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return f(context, *args, **kwargs)
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     raise exception.InstanceNotFound(instance_id=uuid)
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.917129] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] nova.exception.InstanceNotFound: Instance e9b9f5db-afac-494e-9850-c0d82f26fc68 could not be found.
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] During handling of the above exception, another exception occurred:
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Traceback (most recent call last):
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     ret = obj(*args, **kwargs)
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     exception_handler_v20(status_code, error_body)
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     raise client_exc(message=error_message,
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Neutron server returns request_ids: ['req-784ae45c-b37b-47c0-a38c-0561b93cab64']
[ 1941.917457] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] During handling of the above exception, another exception occurred:
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] Traceback (most recent call last):
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self._deallocate_network(context, instance, requested_networks)
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self.network_api.deallocate_for_instance(
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     data = neutron.list_ports(**search_opts)
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     ret = obj(*args, **kwargs)
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1941.917754] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self.list('ports', self.ports_path, retrieve_all,
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     ret = obj(*args, **kwargs)
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     for r in self._pagination(collection, path, **params):
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     res = self.get(path, params=params)
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     ret = obj(*args, **kwargs)
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self.retry_request("GET", action, body=body,
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     ret = obj(*args, **kwargs)
[ 1941.920043] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1941.922018] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     return self.do_request(method, action, body=body,
[ 1941.922018] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1941.922018] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     ret = obj(*args, **kwargs)
[ 1941.922018] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1941.922018] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     self._handle_fault_response(status_code, replybody, resp)
[ 1941.922018] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1941.922018] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68]     raise exception.Unauthorized()
[ 1941.922018] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] nova.exception.Unauthorized: Not authorized.
[ 1941.922018] env[59577]: ERROR nova.compute.manager [instance: e9b9f5db-afac-494e-9850-c0d82f26fc68] [ 1941.972269] env[59577]: DEBUG oslo_concurrency.lockutils [None req-f85bb0a9-03de-4edc-837e-08ae71a34253 tempest-ServerTagsTestJSON-1042488310 tempest-ServerTagsTestJSON-1042488310-project-member] Lock "e9b9f5db-afac-494e-9850-c0d82f26fc68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 316.139s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1941.985255] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1941.986047] env[59577]: ERROR nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. 
[ 1941.986047] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Traceback (most recent call last): [ 1941.986047] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1941.986047] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1941.986047] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1941.986047] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] result = getattr(controller, method)(*args, **kwargs) [ 1941.986047] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1941.986047] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self._get(image_id) [ 1941.986047] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1941.986047] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1941.986047] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] resp, body = self.http_client.get(url, headers=header) [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self.request(url, 'GET', **kwargs) [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self._handle_response(resp) [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise exc.from_response(resp, resp.content) [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] During handling of the above exception, another exception occurred: [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1941.986326] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Traceback (most recent call last): [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] yield resources [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self.driver.spawn(context, instance, image_meta, [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self._fetch_image_if_missing(context, vi) [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] image_fetch(context, vi, tmp_image_ds_loc) [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] images.fetch_image( [ 1941.986622] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] metadata = IMAGE_API.get(context, image_ref) [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return session.show(context, image_id, [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] _reraise_translated_image_exception(image_id) [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise new_exc.with_traceback(exc_trace) [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1941.986966] env[59577]: ERROR nova.compute.manager 
[instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] result = getattr(controller, method)(*args, **kwargs) [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1941.986966] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self._get(image_id) [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] resp, body = self.http_client.get(url, headers=header) [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self.request(url, 'GET', **kwargs) [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self._handle_response(resp) [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise exc.from_response(resp, resp.content) [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. [ 1941.987269] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1941.987538] env[59577]: INFO nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Terminating instance [ 1941.988230] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1941.988453] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1941.989223] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9c59203f-1170-4699-adab-95df7ba76fdf 
{{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1941.991457] env[59577]: DEBUG nova.compute.manager [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1941.997118] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1941.997118] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1941.997118] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6f80de9-804c-472b-818a-f42045a4754f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.004156] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1942.004410] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1eadb56e-4b94-4ee2-90df-479ad00858b7 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.007406] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1942.007695] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1942.008726] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1aa4cdb2-c431-45c8-ad7e-94cecdc1a9a8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.014248] env[59577]: DEBUG oslo_vmware.api [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Waiting for the task: (returnval){ [ 1942.014248] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52c9432d-f461-d067-e727-c9f046c71040" [ 1942.014248] env[59577]: _type = "Task" [ 1942.014248] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1942.032598] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1942.033021] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Creating directory with path [datastore1] vmware_temp/be9d82e8-c964-4d83-b7eb-7dbfc451a0ea/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1942.035620] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-606a0957-2d4f-429f-a925-f96cd16f0e7f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.049214] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1942.049462] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1942.051023] env[59577]: INFO nova.compute.claims 
[None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1942.055378] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Created directory with path [datastore1] vmware_temp/be9d82e8-c964-4d83-b7eb-7dbfc451a0ea/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1942.055564] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Fetch image to [datastore1] vmware_temp/be9d82e8-c964-4d83-b7eb-7dbfc451a0ea/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1942.055750] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/be9d82e8-c964-4d83-b7eb-7dbfc451a0ea/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1942.056554] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10734b81-d1d2-4d4c-aa92-c3e4e36850c5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.063543] env[59577]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02a33f0a-5fef-45a8-baeb-143ac65a2ecd {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.074252] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9d6f542-541a-4792-8a25-ea8c94782d26 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.079147] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1942.079351] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1942.079535] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Deleting the datastore file [datastore1] b9d0daac-02e6-4862-b3de-64223d5a4a76 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1942.080103] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b8bf1cd8-2a03-4474-9449-37a9f987d95e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.114970] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-32c35f96-68be-447d-b4f8-e26066bbcdf5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.114970] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Waiting for the task: (returnval){ [ 1942.114970] env[59577]: value = "task-1933817" [ 1942.114970] env[59577]: _type = "Task" [ 1942.114970] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1942.123083] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fdd17cc9-f53d-44bc-b066-6d0fe996b166 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.127914] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Task: {'id': task-1933817, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1942.156031] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1942.247633] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf623d3e-e4ec-473e-8e0d-ba3b70e0fa3f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.255572] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71a7efd6-4145-45b1-b3c1-73519a4f2bc6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.284380] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1942.285134] env[59577]: ERROR nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. 
[ 1942.285134] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Traceback (most recent call last): [ 1942.285134] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1942.285134] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1942.285134] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1942.285134] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] result = getattr(controller, method)(*args, **kwargs) [ 1942.285134] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1942.285134] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self._get(image_id) [ 1942.285134] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1942.285134] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1942.285134] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] resp, body = self.http_client.get(url, headers=header) [ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get
[ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self.request(url, 'GET', **kwargs)
[ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self._handle_response(resp)
[ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise exc.from_response(resp, resp.content)
[ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] During handling of the above exception, another exception occurred:
[ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1942.285462] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Traceback (most recent call last):
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] yield resources
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self.driver.spawn(context, instance, image_meta,
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self._fetch_image_if_missing(context, vi)
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] image_fetch(context, vi, tmp_image_ds_loc)
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] images.fetch_image(
[ 1942.285783] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] metadata = IMAGE_API.get(context, image_ref)
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return session.show(context, image_id,
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] _reraise_translated_image_exception(image_id)
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise new_exc.with_traceback(exc_trace)
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] result = getattr(controller, method)(*args, **kwargs)
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1942.286146] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self._get(image_id)
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] resp, body = self.http_client.get(url, headers=header)
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self.request(url, 'GET', **kwargs)
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self._handle_response(resp)
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise exc.from_response(resp, resp.content)
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c.
[ 1942.286518] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1942.287020] env[59577]: INFO nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Terminating instance
[ 1942.287353] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34b4500a-fe5b-428b-a181-4c3c2e6e7743 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.289764] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1942.289971] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1942.290569] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1942.290757] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1942.290986] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0a7e79bb-b449-4868-bb93-5da008260e89 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.294103] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15395242-6f55-4d10-a112-5674480baf0d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.301545] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e800fd51-5ec9-42e7-a045-3030f5b2bf06 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.307373] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1942.307543] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1942.308470] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-46c55ada-2f40-4411-b231-17fcd3e9dcfe {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.318195] env[59577]: DEBUG nova.compute.provider_tree [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1942.321470] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1942.322499] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cccd7667-a7ef-439b-a7f1-6fc29234a624 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.323996] env[59577]: DEBUG oslo_vmware.api [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Waiting for the task: (returnval){
[ 1942.323996] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52eea9e4-6ad4-3a95-f917-57e7086c2f73"
[ 1942.323996] env[59577]: _type = "Task"
[ 1942.323996] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1942.328047] env[59577]: DEBUG nova.scheduler.client.report [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1942.335818] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1942.336052] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Creating directory with path [datastore1] vmware_temp/d287dd64-b4be-4dd2-9a01-6c8b5c58e4b1/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1942.336254] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cee36b12-2080-4a90-bd82-e30a066acbc6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.342630] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1942.342976] env[59577]: DEBUG nova.compute.manager [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1942.347189] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Created directory with path [datastore1] vmware_temp/d287dd64-b4be-4dd2-9a01-6c8b5c58e4b1/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1942.347379] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Fetch image to [datastore1] vmware_temp/d287dd64-b4be-4dd2-9a01-6c8b5c58e4b1/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1942.347542] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/d287dd64-b4be-4dd2-9a01-6c8b5c58e4b1/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1942.348466] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5863362-d4ea-4db1-bd1e-192e1a9e9129 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.354864] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c726311-af53-4e23-a3b7-40ece0beb360 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.364179] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4b1cab6-0f0c-4225-a811-28b01e893252 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.395794] env[59577]: DEBUG nova.compute.utils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1942.397528] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1719615-a723-49ee-8311-ab9d456a7ce5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.401525] env[59577]: DEBUG nova.compute.manager [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1942.401701] env[59577]: DEBUG nova.network.neutron [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1942.403425] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1942.403620] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1942.403794] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Deleting the datastore file [datastore1] e7945a83-b063-42c4-9991-7f1e0545361d {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1942.404529] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-02616bbd-2b77-4fe3-8f8d-d6c70b74eb38 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.409786] env[59577]: DEBUG nova.compute.manager [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1942.412794] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0877f4eb-86bb-4f60-a2d3-d4bdce3b6e5e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.419042] env[59577]: DEBUG oslo_vmware.api [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Waiting for the task: (returnval){
[ 1942.419042] env[59577]: value = "task-1933819"
[ 1942.419042] env[59577]: _type = "Task"
[ 1942.419042] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1942.424218] env[59577]: DEBUG oslo_vmware.api [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Task: {'id': task-1933819, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1942.432765] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1942.465732] env[59577]: DEBUG nova.policy [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '002d6bada6b3419585fcc67da0129a67', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e87cb8b9e19a447da79735de48c85852', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1942.485516] env[59577]: DEBUG nova.compute.manager [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1942.534494] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1942.534494] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1942.534494] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1942.534787] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1942.534787] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1942.534787] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1942.534910] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1942.535088] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1942.535241] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1942.535400] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1942.535565] env[59577]: DEBUG nova.virt.hardware [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1942.536649] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b77b3e7-fe74-4f7a-8e0e-7f200bf2dd81 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1942.540152] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1942.540900] env[59577]: ERROR nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c.
[ 1942.540900] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Traceback (most recent call last):
[ 1942.540900] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1942.540900] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1942.540900] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1942.540900] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] result = getattr(controller, method)(*args, **kwargs)
[ 1942.540900] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1942.540900] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self._get(image_id)
[ 1942.540900] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1942.540900] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1942.540900] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] resp, body = self.http_client.get(url, headers=header)
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self.request(url, 'GET', **kwargs)
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self._handle_response(resp)
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise exc.from_response(resp, resp.content)
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] During handling of the above exception, another exception occurred:
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1942.541272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Traceback (most recent call last):
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] yield resources
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self.driver.spawn(context, instance, image_meta,
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self._fetch_image_if_missing(context, vi)
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] image_fetch(context, vi, tmp_image_ds_loc)
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] images.fetch_image(
[ 1942.541571] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] metadata = IMAGE_API.get(context, image_ref)
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return session.show(context, image_id,
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] _reraise_translated_image_exception(image_id)
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise new_exc.with_traceback(exc_trace)
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] result = getattr(controller, method)(*args, **kwargs)
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1942.541943] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self._get(image_id)
[ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] resp, body = self.http_client.get(url, headers=header)
[ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self.request(url, 'GET', **kwargs)
[ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self._handle_response(resp) [ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise exc.from_response(resp, resp.content) [ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. [ 1942.542310] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] [ 1942.542631] env[59577]: INFO nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Terminating instance [ 1942.542746] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1942.542943] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1942.543537] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 
tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1942.543723] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1942.544301] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e3f6b1ba-4404-46f1-8034-108ad6ea2660 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.546875] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4a0a612-11a5-4b75-96e6-4cac2a920593 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.552725] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16cdcd09-f313-4c92-9927-657641509425 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.559036] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1942.559273] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) 
mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1942.559431] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1942.560358] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-359ec1da-0c1a-414a-a321-17d4f2c33b66 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.561655] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1b3928b8-bab5-4178-8acf-61f21cf89652 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.573875] env[59577]: DEBUG oslo_vmware.api [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Waiting for the task: (returnval){ [ 1942.573875] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]524ab7d3-a8a3-f9b7-4139-a962a3e692bc" [ 1942.573875] env[59577]: _type = "Task" [ 1942.573875] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1942.581083] env[59577]: DEBUG oslo_vmware.api [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]524ab7d3-a8a3-f9b7-4139-a962a3e692bc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1942.624394] env[59577]: DEBUG oslo_vmware.api [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Task: {'id': task-1933817, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065819} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1942.624640] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1942.624820] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1942.624996] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1942.625183] env[59577]: INFO nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 1942.627152] env[59577]: DEBUG nova.compute.claims [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1942.627319] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1942.627529] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1942.650913] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.023s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1942.651686] env[59577]: DEBUG nova.compute.utils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Instance b9d0daac-02e6-4862-b3de-64223d5a4a76 could not be found. 
{{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1942.653395] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1942.653626] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1942.653815] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1942.654057] env[59577]: DEBUG nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1942.654240] env[59577]: DEBUG nova.network.neutron [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1942.661497] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1942.661789] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1942.661988] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Deleting the datastore file [datastore1] 1ebb8847-1932-4ed6-8e56-bf48952cfc9c {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1942.662456] env[59577]: DEBUG oslo_vmware.service [-] Invoking 
FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1896c9a8-f1a9-4792-8f76-366e5c39e388 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1942.668520] env[59577]: DEBUG oslo_vmware.api [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Waiting for the task: (returnval){ [ 1942.668520] env[59577]: value = "task-1933821" [ 1942.668520] env[59577]: _type = "Task" [ 1942.668520] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1942.677097] env[59577]: DEBUG oslo_vmware.api [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Task: {'id': task-1933821, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1942.701303] env[59577]: DEBUG nova.compute.manager [req-7128457e-fbf6-4632-b3b9-45b4381447c0 req-dc5e60c3-6d18-4afc-8fc4-f47fcd2f7f61 service nova] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Received event network-vif-plugged-63f4eeb6-1ad0-4da2-a8ac-8ff233044606 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1942.701518] env[59577]: DEBUG oslo_concurrency.lockutils [req-7128457e-fbf6-4632-b3b9-45b4381447c0 req-dc5e60c3-6d18-4afc-8fc4-f47fcd2f7f61 service nova] Acquiring lock "56259797-6883-437c-8942-5beca0e1ef7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1942.701725] env[59577]: DEBUG oslo_concurrency.lockutils [req-7128457e-fbf6-4632-b3b9-45b4381447c0 req-dc5e60c3-6d18-4afc-8fc4-f47fcd2f7f61 service nova] Lock "56259797-6883-437c-8942-5beca0e1ef7b-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1942.701889] env[59577]: DEBUG oslo_concurrency.lockutils [req-7128457e-fbf6-4632-b3b9-45b4381447c0 req-dc5e60c3-6d18-4afc-8fc4-f47fcd2f7f61 service nova] Lock "56259797-6883-437c-8942-5beca0e1ef7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1942.702165] env[59577]: DEBUG nova.compute.manager [req-7128457e-fbf6-4632-b3b9-45b4381447c0 req-dc5e60c3-6d18-4afc-8fc4-f47fcd2f7f61 service nova] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] No waiting events found dispatching network-vif-plugged-63f4eeb6-1ad0-4da2-a8ac-8ff233044606 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1942.702238] env[59577]: WARNING nova.compute.manager [req-7128457e-fbf6-4632-b3b9-45b4381447c0 req-dc5e60c3-6d18-4afc-8fc4-f47fcd2f7f61 service nova] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Received unexpected event network-vif-plugged-63f4eeb6-1ad0-4da2-a8ac-8ff233044606 for instance with vm_state building and task_state spawning. 
[ 1942.754579] env[59577]: DEBUG neutronclient.v2_0.client [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59577) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1942.756296] env[59577]: ERROR nova.compute.manager [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Traceback (most recent call last): [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] result = getattr(controller, method)(*args, **kwargs) [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self._get(image_id) [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1942.756296] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] resp, body = self.http_client.get(url, headers=header) [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self.request(url, 'GET', **kwargs) [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self._handle_response(resp) [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise exc.from_response(resp, resp.content) [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. 
Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] During handling of the above exception, another exception occurred: [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Traceback (most recent call last): [ 1942.756675] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self.driver.spawn(context, instance, image_meta, [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self._fetch_image_if_missing(context, vi) [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] image_fetch(context, vi, tmp_image_ds_loc) 
[ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] images.fetch_image( [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] metadata = IMAGE_API.get(context, image_ref) [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1942.757031] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return session.show(context, image_id, [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] _reraise_translated_image_exception(image_id) [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise new_exc.with_traceback(exc_trace) [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1942.757366] env[59577]: ERROR nova.compute.manager 
[instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] result = getattr(controller, method)(*args, **kwargs) [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self._get(image_id) [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1942.757366] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] resp, body = self.http_client.get(url, headers=header) [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self.request(url, 'GET', **kwargs) [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self._handle_response(resp) [ 1942.757681] env[59577]: 
ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise exc.from_response(resp, resp.content) [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] During handling of the above exception, another exception occurred: [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Traceback (most recent call last): [ 1942.757681] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self._build_and_run_instance(context, instance, image, [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] with excutils.save_and_reraise_exception(): [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: 
b9d0daac-02e6-4862-b3de-64223d5a4a76] self.force_reraise() [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise self.value [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] with self.rt.instance_claim(context, instance, node, allocs, [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self.abort() [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1942.757954] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return f(*args, **kwargs) [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] 
self._unset_instance_host_and_node(instance) [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] instance.save() [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] updates, result = self.indirection_api.object_action( [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return cctxt.call(context, 'object_action', objinst=objinst, [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1942.758260] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] result = self.transport._send( [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self._driver.send(target, ctxt, message, [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1942.758538] env[59577]: ERROR 
nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise result [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] nova.exception_Remote.InstanceNotFound_Remote: Instance b9d0daac-02e6-4862-b3de-64223d5a4a76 could not be found. [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Traceback (most recent call last): [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return getattr(target, method)(*args, **kwargs) [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.758538] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return fn(self, *args, **kwargs) [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1942.758844] env[59577]: ERROR 
nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] old_ref, inst_ref = db.instance_update_and_get_original( [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return f(*args, **kwargs) [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] with excutils.save_and_reraise_exception() as ectxt: [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self.force_reraise() [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.758844] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise self.value [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: 
b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return f(*args, **kwargs) [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return f(context, *args, **kwargs) [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise exception.InstanceNotFound(instance_id=uuid) [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.759173] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] nova.exception.InstanceNotFound: Instance b9d0daac-02e6-4862-b3de-64223d5a4a76 could not be found. 
[ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] During handling of the above exception, another exception occurred: [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Traceback (most recent call last): [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] ret = obj(*args, **kwargs) [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] exception_handler_v20(status_code, error_body) [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise client_exc(message=error_message, [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: 
b9d0daac-02e6-4862-b3de-64223d5a4a76] Neutron server returns request_ids: ['req-e1938c6b-abe5-4650-86e6-c37cd985fd70'] [ 1942.759572] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] During handling of the above exception, another exception occurred: [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] Traceback (most recent call last): [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self._deallocate_network(context, instance, requested_networks) [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self.network_api.deallocate_for_instance( [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] data = neutron.list_ports(**search_opts) [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] ret = obj(*args, **kwargs) [ 1942.759898] env[59577]: ERROR 
nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1942.759898] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self.list('ports', self.ports_path, retrieve_all, [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] ret = obj(*args, **kwargs) [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] for r in self._pagination(collection, path, **params): [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] res = self.get(path, params=params) [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] ret = obj(*args, **kwargs) [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self.retry_request("GET", action, body=body, [ 
1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] ret = obj(*args, **kwargs) [ 1942.760419] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1942.760816] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] return self.do_request(method, action, body=body, [ 1942.760816] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1942.760816] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] ret = obj(*args, **kwargs) [ 1942.760816] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1942.760816] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] self._handle_fault_response(status_code, replybody, resp) [ 1942.760816] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1942.760816] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] raise exception.Unauthorized() [ 1942.760816] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] nova.exception.Unauthorized: Not authorized. 
[ 1942.760816] env[59577]: ERROR nova.compute.manager [instance: b9d0daac-02e6-4862-b3de-64223d5a4a76] [ 1942.778440] env[59577]: DEBUG oslo_concurrency.lockutils [None req-98576864-bee2-4653-b90b-e30c5b4cd40f tempest-MultipleCreateTestJSON-91958749 tempest-MultipleCreateTestJSON-91958749-project-member] Lock "b9d0daac-02e6-4862-b3de-64223d5a4a76" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 313.490s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1942.790140] env[59577]: DEBUG nova.compute.manager [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1942.838335] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1942.838629] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1942.844051] env[59577]: INFO nova.compute.claims [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 
tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1942.885197] env[59577]: DEBUG nova.network.neutron [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Successfully updated port: 63f4eeb6-1ad0-4da2-a8ac-8ff233044606 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1942.892618] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "refresh_cache-56259797-6883-437c-8942-5beca0e1ef7b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1942.892842] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquired lock "refresh_cache-56259797-6883-437c-8942-5beca0e1ef7b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1942.892903] env[59577]: DEBUG nova.network.neutron [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1942.927561] env[59577]: DEBUG oslo_vmware.api [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Task: {'id': task-1933819, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066582} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1942.927561] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1942.927742] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1942.927877] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1942.928066] env[59577]: INFO nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Took 0.64 seconds to destroy the instance on the hypervisor. 
[ 1942.930290] env[59577]: DEBUG nova.compute.claims [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1942.930290] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1942.966107] env[59577]: DEBUG nova.network.neutron [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Instance cache missing network info. 
{{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1943.027522] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c938c5b8-e9c8-4c6d-9447-4d274c235d9d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.035621] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b42de0d-745d-496f-b652-83c78bb43099 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.069425] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9aadf91-4970-45b6-ad5f-e2e3b8dffa68 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.079320] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7403649-fa97-4190-a39b-0cf728a4f826 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.093541] env[59577]: DEBUG nova.compute.provider_tree [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1943.097802] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1943.098072] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None 
req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Creating directory with path [datastore1] vmware_temp/df6f2ef7-1be7-43f7-a732-287bd40e34ad/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1943.098301] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-232f391a-9a6c-41a0-aedd-ff46e04c2e75 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.102898] env[59577]: DEBUG nova.scheduler.client.report [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1943.111381] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Created directory with path [datastore1] vmware_temp/df6f2ef7-1be7-43f7-a732-287bd40e34ad/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1943.111381] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Fetch image to 
[datastore1] vmware_temp/df6f2ef7-1be7-43f7-a732-287bd40e34ad/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1943.111381] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/df6f2ef7-1be7-43f7-a732-287bd40e34ad/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1943.112012] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36724336-5f06-4872-bb06-d235464cd392 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.116792] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1943.120071] env[59577]: DEBUG nova.compute.manager [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Start building networks asynchronously for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1943.120474] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.190s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1943.125810] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c0fc209-692a-46fd-a68d-17431339022b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.137577] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6418ac8-d58c-4b80-8bed-5c76424ccfdc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.174536] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.054s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1943.175390] env[59577]: DEBUG nova.compute.utils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Instance e7945a83-b063-42c4-9991-7f1e0545361d could not be found. 
{{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1943.178387] env[59577]: DEBUG nova.compute.utils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1943.181809] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ff6be56-e8bc-4071-b087-11fc5efee746 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.185058] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1943.185058] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1943.185224] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1943.185275] env[59577]: DEBUG nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1943.186130] env[59577]: DEBUG nova.network.neutron [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1943.187616] env[59577]: DEBUG nova.compute.manager [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1943.187780] env[59577]: DEBUG nova.network.neutron [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1943.189554] env[59577]: DEBUG nova.compute.manager [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Start building block device mappings for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1943.197667] env[59577]: DEBUG oslo_vmware.api [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Task: {'id': task-1933821, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07237} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1943.199215] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1943.199399] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1943.199566] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1943.199729] env[59577]: INFO nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Took 0.66 seconds to destroy the instance on the hypervisor. 
[ 1943.202173] env[59577]: DEBUG nova.compute.claims [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1943.202173] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1943.202173] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1943.204963] env[59577]: DEBUG nova.network.neutron [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Successfully created port: c089d5d4-e798-41e1-bac8-058f3a1fe9de {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1943.209038] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-596f00cc-6b5e-428e-ae49-8f8548edbba6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.228723] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 
tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1943.240466] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.038s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1943.241149] env[59577]: DEBUG nova.compute.utils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Instance 1ebb8847-1932-4ed6-8e56-bf48952cfc9c could not be found. {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1943.243418] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Instance disappeared during build. 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1943.243589] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1943.243753] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1943.244053] env[59577]: DEBUG nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1943.244089] env[59577]: DEBUG nova.network.neutron [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1943.281831] env[59577]: DEBUG nova.compute.manager [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1943.298447] env[59577]: DEBUG nova.policy [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bcd191e0471f48bbb84e0a7746672205', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3c4c39d2e9c649b797455441eedf22d0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1943.352787] env[59577]: DEBUG oslo_vmware.rw_handles [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/df6f2ef7-1be7-43f7-a732-287bd40e34ad/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1943.410497] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1943.410724] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1943.410956] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1943.411124] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 
tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1943.411289] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1943.411447] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1943.411659] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1943.411820] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1943.411984] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:501}} [ 1943.412166] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1943.412338] env[59577]: DEBUG nova.virt.hardware [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1943.414660] env[59577]: DEBUG nova.network.neutron [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Updating instance_info_cache with network_info: [{"id": "63f4eeb6-1ad0-4da2-a8ac-8ff233044606", "address": "fa:16:3e:c4:1b:0d", "network": {"id": "a61ea66b-b7b2-4c86-976d-641129187c28", "bridge": "br-int", "label": "tempest-ServersTestJSON-427195933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8a8bdbc34699435bbde5622db4df613f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6a6f7bb-6106-4cfd-9aef-b85628d0cefa", "external-id": "nsx-vlan-transportzone-194", "segmentation_id": 194, "bound_drivers": {"0": "nsxv3"}}, 
"devname": "tap63f4eeb6-1a", "ovs_interfaceid": "63f4eeb6-1ad0-4da2-a8ac-8ff233044606", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1943.416111] env[59577]: DEBUG neutronclient.v2_0.client [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59577) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1943.417580] env[59577]: ERROR nova.compute.manager [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Traceback (most recent call last): [ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] result = getattr(controller, method)(*args, **kwargs) [ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self._get(image_id) [ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1943.417580] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] resp, body = self.http_client.get(url, headers=header) [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self.request(url, 'GET', **kwargs) [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self._handle_response(resp) [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise exc.from_response(resp, resp.content) [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] During handling of the above exception, another exception occurred: [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Traceback (most recent call last): [ 1943.417921] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self.driver.spawn(context, instance, image_meta, [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self._fetch_image_if_missing(context, vi) [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] image_fetch(context, vi, tmp_image_ds_loc) [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] images.fetch_image( [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] metadata = IMAGE_API.get(context, image_ref) [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1943.418241] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return session.show(context, image_id, [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] _reraise_translated_image_exception(image_id) [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise new_exc.with_traceback(exc_trace) [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 
1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] result = getattr(controller, method)(*args, **kwargs) [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self._get(image_id) [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1943.418610] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] resp, body = self.http_client.get(url, headers=header) [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self.request(url, 'GET', **kwargs) [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self._handle_response(resp) [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise exc.from_response(resp, resp.content) [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] During handling of the above exception, another exception occurred: [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Traceback (most recent call last): [ 1943.418894] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self._build_and_run_instance(context, instance, image, [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] with excutils.save_and_reraise_exception(): [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self.force_reraise() [ 1943.419185] env[59577]: 
ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise self.value [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] with self.rt.instance_claim(context, instance, node, allocs, [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self.abort() [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1943.419185] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return f(*args, **kwargs) [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self._unset_instance_host_and_node(instance) [ 1943.419488] env[59577]: ERROR nova.compute.manager 
[instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] instance.save() [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] updates, result = self.indirection_api.object_action( [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return cctxt.call(context, 'object_action', objinst=objinst, [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1943.419488] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] result = self.transport._send( [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self._driver.send(target, ctxt, message, [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self._send(target, ctxt, 
message, wait_for_reply, timeout, [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise result [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] nova.exception_Remote.InstanceNotFound_Remote: Instance e7945a83-b063-42c4-9991-7f1e0545361d could not be found. [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Traceback (most recent call last): [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return getattr(target, method)(*args, **kwargs) [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] [ 1943.419762] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return fn(self, *args, **kwargs) [ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] [ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] old_ref, inst_ref = 
db.instance_update_and_get_original(
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return f(*args, **kwargs)
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] with excutils.save_and_reraise_exception() as ectxt:
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self.force_reraise()
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420056] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise self.value
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return f(*args, **kwargs)
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return f(context, *args, **kwargs)
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise exception.InstanceNotFound(instance_id=uuid)
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420372] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] nova.exception.InstanceNotFound: Instance e7945a83-b063-42c4-9991-7f1e0545361d could not be found.
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] During handling of the above exception, another exception occurred:
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Traceback (most recent call last):
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] ret = obj(*args, **kwargs)
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] exception_handler_v20(status_code, error_body)
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise client_exc(message=error_message,
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Neutron server returns request_ids: ['req-eb1052b3-3514-4ccc-9f56-4a7a905709ca']
[ 1943.420740] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] During handling of the above exception, another exception occurred:
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] Traceback (most recent call last):
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self._deallocate_network(context, instance, requested_networks)
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self.network_api.deallocate_for_instance(
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] data = neutron.list_ports(**search_opts)
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] ret = obj(*args, **kwargs)
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1943.421060] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self.list('ports', self.ports_path, retrieve_all,
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] ret = obj(*args, **kwargs)
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] for r in self._pagination(collection, path, **params):
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] res = self.get(path, params=params)
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] ret = obj(*args, **kwargs)
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self.retry_request("GET", action, body=body,
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] ret = obj(*args, **kwargs)
[ 1943.421342] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1943.421627] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] return self.do_request(method, action, body=body,
[ 1943.421627] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1943.421627] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] ret = obj(*args, **kwargs)
[ 1943.421627] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1943.421627] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] self._handle_fault_response(status_code, replybody, resp)
[ 1943.421627] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1943.421627] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] raise exception.Unauthorized()
[ 1943.421627] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d] nova.exception.Unauthorized: Not authorized.
[ 1943.421627] env[59577]: ERROR nova.compute.manager [instance: e7945a83-b063-42c4-9991-7f1e0545361d]
[ 1943.421627] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66a3194f-5a03-4596-9703-65c6fcdac6bd {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1943.424139] env[59577]: DEBUG oslo_vmware.rw_handles [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Completed reading data from the image iterator. {{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1943.424312] env[59577]: DEBUG oslo_vmware.rw_handles [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/df6f2ef7-1be7-43f7-a732-287bd40e34ad/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1943.430508] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26056bc7-1f03-4dba-bdf5-6c4be34972c8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1943.436790] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Releasing lock "refresh_cache-56259797-6883-437c-8942-5beca0e1ef7b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1943.437100] env[59577]: DEBUG nova.compute.manager [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Instance network_info: |[{"id": "63f4eeb6-1ad0-4da2-a8ac-8ff233044606", "address": "fa:16:3e:c4:1b:0d", "network": {"id": "a61ea66b-b7b2-4c86-976d-641129187c28", "bridge": "br-int", "label": "tempest-ServersTestJSON-427195933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8a8bdbc34699435bbde5622db4df613f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6a6f7bb-6106-4cfd-9aef-b85628d0cefa", "external-id": "nsx-vlan-transportzone-194", "segmentation_id": 194, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap63f4eeb6-1a", "ovs_interfaceid": "63f4eeb6-1ad0-4da2-a8ac-8ff233044606", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 1943.439661] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c4:1b:0d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd6a6f7bb-6106-4cfd-9aef-b85628d0cefa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '63f4eeb6-1ad0-4da2-a8ac-8ff233044606', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1943.446893] env[59577]: DEBUG oslo.service.loopingcall [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1943.447672] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1943.448110] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0c09ca57-03e3-4837-92f3-4ba170f11bff {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1943.470449] env[59577]: DEBUG oslo_concurrency.lockutils [None req-5901e476-9a1f-4e4d-a3fa-655c36efc0d9 tempest-ServerRescueTestJSON-1338088932 tempest-ServerRescueTestJSON-1338088932-project-member] Lock "e7945a83-b063-42c4-9991-7f1e0545361d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 311.348s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1943.476957] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1943.476957] env[59577]: value = "task-1933822"
[ 1943.476957] env[59577]: _type = "Task"
[ 1943.476957] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1943.482092] env[59577]: DEBUG nova.compute.manager [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1943.487588] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933822, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1943.498487] env[59577]: DEBUG neutronclient.v2_0.client [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59577) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1943.499650] env[59577]: ERROR nova.compute.manager [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1943.499650] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Traceback (most recent call last):
[ 1943.499650] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1943.499650] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1943.499650] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1943.499650] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] result = getattr(controller, method)(*args, **kwargs)
[ 1943.499650] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1943.499650] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self._get(image_id)
[ 1943.499650] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1943.499650] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1943.499650] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] resp, body = self.http_client.get(url, headers=header)
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self.request(url, 'GET', **kwargs)
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self._handle_response(resp)
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise exc.from_response(resp, resp.content)
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] During handling of the above exception, another exception occurred:
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Traceback (most recent call last):
[ 1943.500065] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self.driver.spawn(context, instance, image_meta,
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self._fetch_image_if_missing(context, vi)
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] image_fetch(context, vi, tmp_image_ds_loc)
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] images.fetch_image(
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] metadata = IMAGE_API.get(context, image_ref)
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1943.500407] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return session.show(context, image_id,
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] _reraise_translated_image_exception(image_id)
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise new_exc.with_traceback(exc_trace)
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] result = getattr(controller, method)(*args, **kwargs)
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self._get(image_id)
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1943.500749] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] resp, body = self.http_client.get(url, headers=header)
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self.request(url, 'GET', **kwargs)
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self._handle_response(resp)
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise exc.from_response(resp, resp.content)
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c.
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] During handling of the above exception, another exception occurred:
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Traceback (most recent call last):
[ 1943.501035] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self._build_and_run_instance(context, instance, image,
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] with excutils.save_and_reraise_exception():
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self.force_reraise()
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise self.value
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] with self.rt.instance_claim(context, instance, node, allocs,
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self.abort()
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1943.501317] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return f(*args, **kwargs)
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self._unset_instance_host_and_node(instance)
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] instance.save()
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] updates, result = self.indirection_api.object_action(
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1943.501663] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] result = self.transport._send(
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self._driver.send(target, ctxt, message,
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise result
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] nova.exception_Remote.InstanceNotFound_Remote: Instance 1ebb8847-1932-4ed6-8e56-bf48952cfc9c could not be found.
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Traceback (most recent call last):
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return getattr(target, method)(*args, **kwargs)
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.501956] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return fn(self, *args, **kwargs)
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return f(*args, **kwargs)
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] with excutils.save_and_reraise_exception() as ectxt:
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self.force_reraise()
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.502272] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise self.value
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return f(*args, **kwargs)
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return f(context, *args, **kwargs)
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise exception.InstanceNotFound(instance_id=uuid)
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c]
[ 1943.502597] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] nova.exception.InstanceNotFound: Instance 1ebb8847-1932-4ed6-8e56-bf48952cfc9c could not be found.
[ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] During handling of the above exception, another exception occurred: [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Traceback (most recent call last): [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] ret = obj(*args, **kwargs) [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] exception_handler_v20(status_code, error_body) [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise client_exc(message=error_message, [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 
1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Neutron server returns request_ids: ['req-3254266c-86fd-4803-b511-426dd3f4e956'] [ 1943.502920] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] During handling of the above exception, another exception occurred: [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] Traceback (most recent call last): [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self._deallocate_network(context, instance, requested_networks) [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self.network_api.deallocate_for_instance( [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] data = neutron.list_ports(**search_opts) [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] ret = obj(*args, **kwargs) [ 1943.503856] env[59577]: ERROR 
nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1943.503856] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self.list('ports', self.ports_path, retrieve_all, [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] ret = obj(*args, **kwargs) [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] for r in self._pagination(collection, path, **params): [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] res = self.get(path, params=params) [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] ret = obj(*args, **kwargs) [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self.retry_request("GET", action, body=body, [ 
1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] ret = obj(*args, **kwargs) [ 1943.504154] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1943.504452] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] return self.do_request(method, action, body=body, [ 1943.504452] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1943.504452] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] ret = obj(*args, **kwargs) [ 1943.504452] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1943.504452] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] self._handle_fault_response(status_code, replybody, resp) [ 1943.504452] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1943.504452] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] raise exception.Unauthorized() [ 1943.504452] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] nova.exception.Unauthorized: Not authorized. 
[ 1943.504452] env[59577]: ERROR nova.compute.manager [instance: 1ebb8847-1932-4ed6-8e56-bf48952cfc9c] [ 1943.523923] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ac3136be-e95a-46de-ba5f-44ed5ef7c464 tempest-ServersNegativeTestJSON-870846203 tempest-ServersNegativeTestJSON-870846203-project-member] Lock "1ebb8847-1932-4ed6-8e56-bf48952cfc9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 308.825s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1943.532033] env[59577]: DEBUG oslo_concurrency.lockutils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1943.532307] env[59577]: DEBUG oslo_concurrency.lockutils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1943.534057] env[59577]: INFO nova.compute.claims [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1943.537131] env[59577]: DEBUG nova.compute.manager [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Starting instance... 
{{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1943.594159] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1943.626163] env[59577]: DEBUG nova.network.neutron [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Successfully created port: 6b402131-56b2-48ab-bdc9-ab0e0d2a4709 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1943.713428] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f5d4d4b-8a9a-4cf5-8a6b-e8bef147e5e0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.723018] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cdc64f7-71cf-47fc-8d0e-5b64aa718f7b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.752954] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a3af55a-ce99-41cc-8160-4dbdf326227b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.760658] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a015cdfb-da95-4e64-b6f2-0faa468f9a3b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.776889] env[59577]: DEBUG nova.compute.provider_tree 
[None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1943.786277] env[59577]: DEBUG nova.scheduler.client.report [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1943.801069] env[59577]: DEBUG oslo_concurrency.lockutils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1943.801650] env[59577]: DEBUG nova.compute.manager [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Start building networks asynchronously for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1943.804172] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.210s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1943.805661] env[59577]: INFO nova.compute.claims [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1943.835805] env[59577]: DEBUG nova.compute.utils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1943.837175] env[59577]: DEBUG nova.compute.manager [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Not allocating networking since 'none' was specified. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 1943.848608] env[59577]: DEBUG nova.compute.manager [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Start building block device mappings for instance. 
{{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1943.940450] env[59577]: DEBUG nova.compute.manager [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1943.950385] env[59577]: DEBUG nova.network.neutron [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Successfully updated port: c089d5d4-e798-41e1-bac8-058f3a1fe9de {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1943.970378] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Acquiring lock "refresh_cache-22339279-c381-4ccb-bba0-b0b554203e60" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1943.970613] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Acquired lock "refresh_cache-22339279-c381-4ccb-bba0-b0b554203e60" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1943.970868] env[59577]: DEBUG nova.network.neutron [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1943.983622] env[59577]: 
DEBUG nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1943.983857] env[59577]: DEBUG nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1943.984038] env[59577]: DEBUG nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1943.984318] env[59577]: DEBUG nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1943.984495] env[59577]: DEBUG 
nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1943.984647] env[59577]: DEBUG nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1943.984855] env[59577]: DEBUG nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1943.985053] env[59577]: DEBUG nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1943.985242] env[59577]: DEBUG nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1943.985621] env[59577]: DEBUG nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 1943.985621] env[59577]: DEBUG nova.virt.hardware [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1943.986939] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e4aa3a8-4155-470e-a12b-7ab2f1033e5a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1943.999155] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933822, 'name': CreateVM_Task, 'duration_secs': 0.315444} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1944.001155] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1944.002488] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1944.002612] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1944.002928] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 
tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1944.004554] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fd13cdb-0b10-4031-a5c8-4969432ee958 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.008973] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4aa4d85f-81ef-4ef2-9342-47a314b732cc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.012802] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cbe4b87-4ff4-4287-a225-cab9df732f15 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.023685] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Instance VIF info [] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1944.029623] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Creating folder: Project (88048fc3a01844d8ad854788fadf5bc4). Parent ref: group-v398749. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1944.031277] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b09771df-a974-4c97-8cd0-e994c9a0d9a9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.032983] env[59577]: DEBUG oslo_vmware.api [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Waiting for the task: (returnval){ [ 1944.032983] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5215e351-ff43-de44-1c78-dcf2f5b2b88d" [ 1944.032983] env[59577]: _type = "Task" [ 1944.032983] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1944.038713] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cc68185-7d1c-4009-8fb3-e36b6342b204 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.042674] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Created folder: Project (88048fc3a01844d8ad854788fadf5bc4) in parent group-v398749. [ 1944.042792] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Creating folder: Instances. Parent ref: group-v398799. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1944.043517] env[59577]: DEBUG nova.network.neutron [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1944.045679] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ef27f821-fe0f-4f94-bbdb-5b39e292a281 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.075191] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1944.075418] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1944.075631] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1944.079030] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-eea9ab1a-17b4-4a68-b331-1b7bf44cedb5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.082906] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Created folder: Instances in parent group-v398799. [ 1944.083146] env[59577]: DEBUG oslo.service.loopingcall [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1944.083326] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1944.083515] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-86642cb6-463f-416c-90c6-1522d5fd61c9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.099108] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29b7dc0c-89c6-4b40-a00d-fe470bc57646 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.103524] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1944.103524] env[59577]: value = "task-1933825" [ 1944.103524] env[59577]: _type = "Task" [ 1944.103524] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1944.115012] env[59577]: DEBUG nova.compute.provider_tree [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1944.119818] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933825, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1944.124865] env[59577]: DEBUG nova.scheduler.client.report [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1944.139916] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1944.140438] env[59577]: DEBUG nova.compute.manager [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 
3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1944.179216] env[59577]: DEBUG nova.compute.utils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1944.180732] env[59577]: DEBUG nova.compute.manager [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Allocating IP information in the background. {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1944.180914] env[59577]: DEBUG nova.network.neutron [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1944.189774] env[59577]: DEBUG nova.compute.manager [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1944.254466] env[59577]: DEBUG nova.compute.manager [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Start spawning the instance on the hypervisor. 
{{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1944.284251] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-12-17T13:42:33Z,direct_url=,disk_format='vmdk',id=d5e691af-5903-46f6-a589-e220c4e5798c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5cabf3c5743c484db7095e0ffc0e5d73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-12-17T13:42:34Z,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1944.284573] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1944.284789] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1944.285072] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 1944.285411] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1944.285709] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1944.286087] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1944.286238] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1944.286494] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1944.286788] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 1944.287085] env[59577]: DEBUG nova.virt.hardware [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1944.288065] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f175a54e-cc28-4897-8162-eb1fd9116ac4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.292591] env[59577]: DEBUG nova.policy [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e8d3a0b5b4e94dfaae0ad84c3a9ca385', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d52da3118854611a3e2a8e5899b6ad9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 1944.299489] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7ada8c4-224b-419d-9d56-922b426fa8e0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.387442] env[59577]: DEBUG nova.network.neutron [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Updating instance_info_cache with network_info: [{"id": "c089d5d4-e798-41e1-bac8-058f3a1fe9de", "address": 
"fa:16:3e:c2:f3:bd", "network": {"id": "3ba4c698-3ba8-452d-8b6f-9370193d0029", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1033918078-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e87cb8b9e19a447da79735de48c85852", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc089d5d4-e7", "ovs_interfaceid": "c089d5d4-e798-41e1-bac8-058f3a1fe9de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1944.402951] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Releasing lock "refresh_cache-22339279-c381-4ccb-bba0-b0b554203e60" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1944.403201] env[59577]: DEBUG nova.compute.manager [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Instance network_info: |[{"id": "c089d5d4-e798-41e1-bac8-058f3a1fe9de", "address": "fa:16:3e:c2:f3:bd", "network": {"id": 
"3ba4c698-3ba8-452d-8b6f-9370193d0029", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1033918078-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e87cb8b9e19a447da79735de48c85852", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc089d5d4-e7", "ovs_interfaceid": "c089d5d4-e798-41e1-bac8-058f3a1fe9de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1944.403568] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c2:f3:bd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c089d5d4-e798-41e1-bac8-058f3a1fe9de', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1944.411908] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 
tempest-AttachVolumeNegativeTest-1299256984-project-member] Creating folder: Project (e87cb8b9e19a447da79735de48c85852). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1944.413444] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-aeb0ee4d-b64d-47ca-ba61-9c35bfc450cd {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.416015] env[59577]: DEBUG nova.compute.manager [req-57985ce9-2b6b-4772-b76a-6fe89d01def6 req-52523727-f3ae-46de-94e3-fe284e5aef49 service nova] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Received event network-vif-plugged-6b402131-56b2-48ab-bdc9-ab0e0d2a4709 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1944.416285] env[59577]: DEBUG oslo_concurrency.lockutils [req-57985ce9-2b6b-4772-b76a-6fe89d01def6 req-52523727-f3ae-46de-94e3-fe284e5aef49 service nova] Acquiring lock "9c96e4d7-30d3-44fc-b8d0-14271dc19ce3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1944.416560] env[59577]: DEBUG oslo_concurrency.lockutils [req-57985ce9-2b6b-4772-b76a-6fe89d01def6 req-52523727-f3ae-46de-94e3-fe284e5aef49 service nova] Lock "9c96e4d7-30d3-44fc-b8d0-14271dc19ce3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1944.416812] env[59577]: DEBUG oslo_concurrency.lockutils [req-57985ce9-2b6b-4772-b76a-6fe89d01def6 req-52523727-f3ae-46de-94e3-fe284e5aef49 service nova] Lock "9c96e4d7-30d3-44fc-b8d0-14271dc19ce3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1944.417051] env[59577]: DEBUG nova.compute.manager [req-57985ce9-2b6b-4772-b76a-6fe89d01def6 req-52523727-f3ae-46de-94e3-fe284e5aef49 service nova] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] No waiting events found dispatching network-vif-plugged-6b402131-56b2-48ab-bdc9-ab0e0d2a4709 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1944.417282] env[59577]: WARNING nova.compute.manager [req-57985ce9-2b6b-4772-b76a-6fe89d01def6 req-52523727-f3ae-46de-94e3-fe284e5aef49 service nova] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Received unexpected event network-vif-plugged-6b402131-56b2-48ab-bdc9-ab0e0d2a4709 for instance with vm_state building and task_state spawning. [ 1944.426568] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Created folder: Project (e87cb8b9e19a447da79735de48c85852) in parent group-v398749. [ 1944.427142] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Creating folder: Instances. Parent ref: group-v398802. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1944.427426] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0a6db95d-7324-4664-b8e5-f4fecfab190c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.438753] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Created folder: Instances in parent group-v398802. 
[ 1944.439064] env[59577]: DEBUG oslo.service.loopingcall [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1944.439306] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1944.440110] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d6ba20dc-d239-4f58-8ec0-5ca09a59302a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.459140] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1944.459140] env[59577]: value = "task-1933828" [ 1944.459140] env[59577]: _type = "Task" [ 1944.459140] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1944.459981] env[59577]: DEBUG nova.network.neutron [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Successfully updated port: 6b402131-56b2-48ab-bdc9-ab0e0d2a4709 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1944.469286] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933828, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1944.470317] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Acquiring lock "refresh_cache-9c96e4d7-30d3-44fc-b8d0-14271dc19ce3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1944.470530] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Acquired lock "refresh_cache-9c96e4d7-30d3-44fc-b8d0-14271dc19ce3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1944.470731] env[59577]: DEBUG nova.network.neutron [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1944.505419] env[59577]: DEBUG nova.network.neutron [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1944.613828] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933825, 'name': CreateVM_Task, 'duration_secs': 0.255873} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1944.613986] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1944.614388] env[59577]: DEBUG oslo_concurrency.lockutils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1944.614559] env[59577]: DEBUG oslo_concurrency.lockutils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1944.614895] env[59577]: DEBUG oslo_concurrency.lockutils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1944.615155] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e50ccecc-1bb1-436f-a2f4-eb52d6f6c09e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.620103] env[59577]: DEBUG oslo_vmware.api [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Waiting for the task: (returnval){ [ 1944.620103] 
env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5283a6dd-6316-1782-5221-226fea39876f" [ 1944.620103] env[59577]: _type = "Task" [ 1944.620103] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1944.627986] env[59577]: DEBUG oslo_vmware.api [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5283a6dd-6316-1782-5221-226fea39876f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1944.740133] env[59577]: DEBUG nova.compute.manager [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Received event network-changed-63f4eeb6-1ad0-4da2-a8ac-8ff233044606 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1944.740399] env[59577]: DEBUG nova.compute.manager [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Refreshing instance network info cache due to event network-changed-63f4eeb6-1ad0-4da2-a8ac-8ff233044606. 
{{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1944.740669] env[59577]: DEBUG oslo_concurrency.lockutils [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] Acquiring lock "refresh_cache-56259797-6883-437c-8942-5beca0e1ef7b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1944.740869] env[59577]: DEBUG oslo_concurrency.lockutils [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] Acquired lock "refresh_cache-56259797-6883-437c-8942-5beca0e1ef7b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1944.741173] env[59577]: DEBUG nova.network.neutron [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Refreshing network info cache for port 63f4eeb6-1ad0-4da2-a8ac-8ff233044606 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1944.743343] env[59577]: DEBUG nova.network.neutron [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Updating instance_info_cache with network_info: [{"id": "6b402131-56b2-48ab-bdc9-ab0e0d2a4709", "address": "fa:16:3e:76:93:16", "network": {"id": "ad2922e8-aa8b-402d-af0f-beb0734c5e2a", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-104285320-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": 
"3c4c39d2e9c649b797455441eedf22d0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b402131-56", "ovs_interfaceid": "6b402131-56b2-48ab-bdc9-ab0e0d2a4709", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1944.753395] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Releasing lock "refresh_cache-9c96e4d7-30d3-44fc-b8d0-14271dc19ce3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1944.753822] env[59577]: DEBUG nova.compute.manager [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Instance network_info: |[{"id": "6b402131-56b2-48ab-bdc9-ab0e0d2a4709", "address": "fa:16:3e:76:93:16", "network": {"id": "ad2922e8-aa8b-402d-af0f-beb0734c5e2a", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-104285320-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c4c39d2e9c649b797455441eedf22d0", 
"mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b402131-56", "ovs_interfaceid": "6b402131-56b2-48ab-bdc9-ab0e0d2a4709", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1944.754303] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:76:93:16', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '298bb8ef-4765-494c-b157-7a349218bd1e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6b402131-56b2-48ab-bdc9-ab0e0d2a4709', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1944.762915] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Creating folder: Project (3c4c39d2e9c649b797455441eedf22d0). Parent ref: group-v398749. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1944.763671] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-68ef1240-be8c-43d2-884e-fc426edee1ba {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.774253] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Created folder: Project (3c4c39d2e9c649b797455441eedf22d0) in parent group-v398749. [ 1944.774623] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Creating folder: Instances. Parent ref: group-v398805. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1944.774995] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2caf10d9-8f6b-4519-8ab5-ca9a96a713c7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.789735] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Created folder: Instances in parent group-v398805. [ 1944.789735] env[59577]: DEBUG oslo.service.loopingcall [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1944.789911] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1944.790015] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1baa0479-4a3e-476f-aed2-35a8f6a42d52 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1944.812048] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1944.812048] env[59577]: value = "task-1933831" [ 1944.812048] env[59577]: _type = "Task" [ 1944.812048] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1944.819965] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933831, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1944.896281] env[59577]: DEBUG nova.network.neutron [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Successfully created port: a67d68db-4bf8-431b-b399-abdf86da1d91 {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1944.970257] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933828, 'name': CreateVM_Task, 'duration_secs': 0.404874} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1944.970452] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1944.971242] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1945.099205] env[59577]: DEBUG nova.network.neutron [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Updated VIF entry in instance network info cache for port 63f4eeb6-1ad0-4da2-a8ac-8ff233044606. 
{{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1945.099558] env[59577]: DEBUG nova.network.neutron [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Updating instance_info_cache with network_info: [{"id": "63f4eeb6-1ad0-4da2-a8ac-8ff233044606", "address": "fa:16:3e:c4:1b:0d", "network": {"id": "a61ea66b-b7b2-4c86-976d-641129187c28", "bridge": "br-int", "label": "tempest-ServersTestJSON-427195933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8a8bdbc34699435bbde5622db4df613f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6a6f7bb-6106-4cfd-9aef-b85628d0cefa", "external-id": "nsx-vlan-transportzone-194", "segmentation_id": 194, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap63f4eeb6-1a", "ovs_interfaceid": "63f4eeb6-1ad0-4da2-a8ac-8ff233044606", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1945.111301] env[59577]: DEBUG oslo_concurrency.lockutils [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] Releasing lock "refresh_cache-56259797-6883-437c-8942-5beca0e1ef7b" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1945.111587] env[59577]: DEBUG nova.compute.manager 
[req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Received event network-vif-plugged-c089d5d4-e798-41e1-bac8-058f3a1fe9de {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1945.111803] env[59577]: DEBUG oslo_concurrency.lockutils [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] Acquiring lock "22339279-c381-4ccb-bba0-b0b554203e60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1945.112033] env[59577]: DEBUG oslo_concurrency.lockutils [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] Lock "22339279-c381-4ccb-bba0-b0b554203e60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1945.112303] env[59577]: DEBUG oslo_concurrency.lockutils [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] Lock "22339279-c381-4ccb-bba0-b0b554203e60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1945.112405] env[59577]: DEBUG nova.compute.manager [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] No waiting events found dispatching network-vif-plugged-c089d5d4-e798-41e1-bac8-058f3a1fe9de {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1945.112596] env[59577]: WARNING nova.compute.manager [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 
req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Received unexpected event network-vif-plugged-c089d5d4-e798-41e1-bac8-058f3a1fe9de for instance with vm_state building and task_state spawning. [ 1945.112789] env[59577]: DEBUG nova.compute.manager [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Received event network-changed-c089d5d4-e798-41e1-bac8-058f3a1fe9de {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1945.112969] env[59577]: DEBUG nova.compute.manager [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Refreshing instance network info cache due to event network-changed-c089d5d4-e798-41e1-bac8-058f3a1fe9de. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1945.113193] env[59577]: DEBUG oslo_concurrency.lockutils [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] Acquiring lock "refresh_cache-22339279-c381-4ccb-bba0-b0b554203e60" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1945.113486] env[59577]: DEBUG oslo_concurrency.lockutils [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] Acquired lock "refresh_cache-22339279-c381-4ccb-bba0-b0b554203e60" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1945.113554] env[59577]: DEBUG nova.network.neutron [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Refreshing network info cache for port c089d5d4-e798-41e1-bac8-058f3a1fe9de {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 
1945.133357] env[59577]: DEBUG oslo_concurrency.lockutils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1945.133556] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1945.133777] env[59577]: DEBUG oslo_concurrency.lockutils [None req-d1f8aa9e-d6d5-4c5f-ba95-d151e178aa02 tempest-ServersAdmin275Test-1714938485 tempest-ServersAdmin275Test-1714938485-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1945.133983] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1945.134293] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1945.134535] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6b8f543a-6fe4-4c5b-b8eb-9ffbf28b72a2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1945.140837] env[59577]: DEBUG oslo_vmware.api [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Waiting for the task: (returnval){ [ 1945.140837] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]526bcfb8-fada-f5da-b353-9ae4250f163a" [ 1945.140837] env[59577]: _type = "Task" [ 1945.140837] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1945.149403] env[59577]: DEBUG oslo_vmware.api [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]526bcfb8-fada-f5da-b353-9ae4250f163a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1945.321887] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933831, 'name': CreateVM_Task, 'duration_secs': 0.310886} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1945.322128] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1945.322838] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1945.460115] env[59577]: DEBUG nova.network.neutron [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Updated VIF entry in instance network info cache for port c089d5d4-e798-41e1-bac8-058f3a1fe9de. 
{{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1945.460351] env[59577]: DEBUG nova.network.neutron [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Updating instance_info_cache with network_info: [{"id": "c089d5d4-e798-41e1-bac8-058f3a1fe9de", "address": "fa:16:3e:c2:f3:bd", "network": {"id": "3ba4c698-3ba8-452d-8b6f-9370193d0029", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1033918078-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e87cb8b9e19a447da79735de48c85852", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc089d5d4-e7", "ovs_interfaceid": "c089d5d4-e798-41e1-bac8-058f3a1fe9de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1945.501105] env[59577]: DEBUG oslo_concurrency.lockutils [req-2332c6b3-b41b-42a0-9ba2-96d07a907729 req-f47112b7-f4f1-45fb-b3a0-7faf2547dd31 service nova] Releasing lock "refresh_cache-22339279-c381-4ccb-bba0-b0b554203e60" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1945.631799] env[59577]: DEBUG nova.network.neutron [None 
req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Successfully updated port: a67d68db-4bf8-431b-b399-abdf86da1d91 {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1945.640818] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Acquiring lock "refresh_cache-3d63cb9b-3c20-4c34-a96e-29e3dcea65a7" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1945.642200] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Acquired lock "refresh_cache-3d63cb9b-3c20-4c34-a96e-29e3dcea65a7" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1945.642200] env[59577]: DEBUG nova.network.neutron [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1945.655055] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1945.655318] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] [instance: 
22339279-c381-4ccb-bba0-b0b554203e60] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1945.655528] env[59577]: DEBUG oslo_concurrency.lockutils [None req-a8a5aced-895b-4c36-9406-825963c0c84e tempest-AttachVolumeNegativeTest-1299256984 tempest-AttachVolumeNegativeTest-1299256984-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1945.655969] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1945.656283] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1945.656831] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5158eec2-373f-4cb3-8775-eadf7c0b39b9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1945.662242] env[59577]: DEBUG oslo_vmware.api [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Waiting for the task: (returnval){ [ 1945.662242] env[59577]: 
value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52178b24-8f9e-4b55-5d77-2564841a971e" [ 1945.662242] env[59577]: _type = "Task" [ 1945.662242] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1945.670429] env[59577]: DEBUG oslo_vmware.api [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52178b24-8f9e-4b55-5d77-2564841a971e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1945.704213] env[59577]: DEBUG nova.network.neutron [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1945.946780] env[59577]: DEBUG nova.network.neutron [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Updating instance_info_cache with network_info: [{"id": "a67d68db-4bf8-431b-b399-abdf86da1d91", "address": "fa:16:3e:fc:aa:fc", "network": {"id": "3d5c6d22-5eec-4f78-ae77-17cb5d9e2ef4", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1604332240-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d52da3118854611a3e2a8e5899b6ad9", "mtu": 8950, 
"physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f1b07b1-e4e5-4842-9090-07fb2c3e124b", "external-id": "nsx-vlan-transportzone-646", "segmentation_id": 646, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa67d68db-4b", "ovs_interfaceid": "a67d68db-4bf8-431b-b399-abdf86da1d91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1945.957221] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Releasing lock "refresh_cache-3d63cb9b-3c20-4c34-a96e-29e3dcea65a7" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1945.957511] env[59577]: DEBUG nova.compute.manager [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Instance network_info: |[{"id": "a67d68db-4bf8-431b-b399-abdf86da1d91", "address": "fa:16:3e:fc:aa:fc", "network": {"id": "3d5c6d22-5eec-4f78-ae77-17cb5d9e2ef4", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1604332240-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d52da3118854611a3e2a8e5899b6ad9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "6f1b07b1-e4e5-4842-9090-07fb2c3e124b", "external-id": "nsx-vlan-transportzone-646", "segmentation_id": 646, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa67d68db-4b", "ovs_interfaceid": "a67d68db-4bf8-431b-b399-abdf86da1d91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1945.957869] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fc:aa:fc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6f1b07b1-e4e5-4842-9090-07fb2c3e124b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a67d68db-4bf8-431b-b399-abdf86da1d91', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1945.965348] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Creating folder: Project (8d52da3118854611a3e2a8e5899b6ad9). Parent ref: group-v398749. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1945.965836] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b70faade-cc7e-4e1c-a029-002cae449759 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1945.976369] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Created folder: Project (8d52da3118854611a3e2a8e5899b6ad9) in parent group-v398749. [ 1945.976565] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Creating folder: Instances. Parent ref: group-v398808. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1945.976881] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-85eac8ee-7e61-4de7-aefb-9aaa75f768fa {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1945.984830] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Created folder: Instances in parent group-v398808. [ 1945.985063] env[59577]: DEBUG oslo.service.loopingcall [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1945.985243] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1945.985426] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ed57604d-d3f0-4132-b88a-1ee8eb792626 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1946.005131] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1946.005131] env[59577]: value = "task-1933834" [ 1946.005131] env[59577]: _type = "Task" [ 1946.005131] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1946.012525] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933834, 'name': CreateVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1946.172230] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1946.172545] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1946.172703] env[59577]: DEBUG oslo_concurrency.lockutils [None 
req-7938d111-f9e1-4656-8b84-598c649be985 tempest-InstanceActionsNegativeTestJSON-291408576 tempest-InstanceActionsNegativeTestJSON-291408576-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1946.440131] env[59577]: DEBUG nova.compute.manager [req-534c4c89-5415-49fc-b45e-9615b36aa37e req-e1a75111-d3ac-4547-95c8-f21fde22ff50 service nova] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Received event network-changed-6b402131-56b2-48ab-bdc9-ab0e0d2a4709 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1946.440291] env[59577]: DEBUG nova.compute.manager [req-534c4c89-5415-49fc-b45e-9615b36aa37e req-e1a75111-d3ac-4547-95c8-f21fde22ff50 service nova] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Refreshing instance network info cache due to event network-changed-6b402131-56b2-48ab-bdc9-ab0e0d2a4709. 
{{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1946.440506] env[59577]: DEBUG oslo_concurrency.lockutils [req-534c4c89-5415-49fc-b45e-9615b36aa37e req-e1a75111-d3ac-4547-95c8-f21fde22ff50 service nova] Acquiring lock "refresh_cache-9c96e4d7-30d3-44fc-b8d0-14271dc19ce3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1946.440660] env[59577]: DEBUG oslo_concurrency.lockutils [req-534c4c89-5415-49fc-b45e-9615b36aa37e req-e1a75111-d3ac-4547-95c8-f21fde22ff50 service nova] Acquired lock "refresh_cache-9c96e4d7-30d3-44fc-b8d0-14271dc19ce3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1946.440863] env[59577]: DEBUG nova.network.neutron [req-534c4c89-5415-49fc-b45e-9615b36aa37e req-e1a75111-d3ac-4547-95c8-f21fde22ff50 service nova] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Refreshing network info cache for port 6b402131-56b2-48ab-bdc9-ab0e0d2a4709 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1946.516092] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933834, 'name': CreateVM_Task, 'duration_secs': 0.300163} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1946.516271] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1946.516936] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1946.517112] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1946.517422] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1946.517654] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a88530ba-b77c-4075-a5da-80f5a763db5f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1946.522399] env[59577]: DEBUG oslo_vmware.api [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Waiting for the task: (returnval){ [ 1946.522399] env[59577]: value = 
"session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5217637d-4409-ce0e-094b-4d85e1b8625e" [ 1946.522399] env[59577]: _type = "Task" [ 1946.522399] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1946.531493] env[59577]: DEBUG oslo_vmware.api [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]5217637d-4409-ce0e-094b-4d85e1b8625e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1946.767404] env[59577]: DEBUG nova.compute.manager [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Received event network-vif-plugged-a67d68db-4bf8-431b-b399-abdf86da1d91 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1946.767404] env[59577]: DEBUG oslo_concurrency.lockutils [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] Acquiring lock "3d63cb9b-3c20-4c34-a96e-29e3dcea65a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1946.767606] env[59577]: DEBUG oslo_concurrency.lockutils [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] Lock "3d63cb9b-3c20-4c34-a96e-29e3dcea65a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1946.767779] env[59577]: DEBUG oslo_concurrency.lockutils [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] Lock 
"3d63cb9b-3c20-4c34-a96e-29e3dcea65a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1946.767944] env[59577]: DEBUG nova.compute.manager [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] No waiting events found dispatching network-vif-plugged-a67d68db-4bf8-431b-b399-abdf86da1d91 {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1946.768181] env[59577]: WARNING nova.compute.manager [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Received unexpected event network-vif-plugged-a67d68db-4bf8-431b-b399-abdf86da1d91 for instance with vm_state building and task_state spawning. [ 1946.768354] env[59577]: DEBUG nova.compute.manager [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Received event network-changed-a67d68db-4bf8-431b-b399-abdf86da1d91 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1946.768508] env[59577]: DEBUG nova.compute.manager [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Refreshing instance network info cache due to event network-changed-a67d68db-4bf8-431b-b399-abdf86da1d91. 
{{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1946.768711] env[59577]: DEBUG oslo_concurrency.lockutils [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] Acquiring lock "refresh_cache-3d63cb9b-3c20-4c34-a96e-29e3dcea65a7" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1946.768856] env[59577]: DEBUG oslo_concurrency.lockutils [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] Acquired lock "refresh_cache-3d63cb9b-3c20-4c34-a96e-29e3dcea65a7" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1946.769015] env[59577]: DEBUG nova.network.neutron [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Refreshing network info cache for port a67d68db-4bf8-431b-b399-abdf86da1d91 {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1946.798083] env[59577]: DEBUG nova.network.neutron [req-534c4c89-5415-49fc-b45e-9615b36aa37e req-e1a75111-d3ac-4547-95c8-f21fde22ff50 service nova] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Updated VIF entry in instance network info cache for port 6b402131-56b2-48ab-bdc9-ab0e0d2a4709. 
{{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1946.798388] env[59577]: DEBUG nova.network.neutron [req-534c4c89-5415-49fc-b45e-9615b36aa37e req-e1a75111-d3ac-4547-95c8-f21fde22ff50 service nova] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Updating instance_info_cache with network_info: [{"id": "6b402131-56b2-48ab-bdc9-ab0e0d2a4709", "address": "fa:16:3e:76:93:16", "network": {"id": "ad2922e8-aa8b-402d-af0f-beb0734c5e2a", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-104285320-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c4c39d2e9c649b797455441eedf22d0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "298bb8ef-4765-494c-b157-7a349218bd1e", "external-id": "nsx-vlan-transportzone-905", "segmentation_id": 905, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b402131-56", "ovs_interfaceid": "6b402131-56b2-48ab-bdc9-ab0e0d2a4709", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1946.807526] env[59577]: DEBUG oslo_concurrency.lockutils [req-534c4c89-5415-49fc-b45e-9615b36aa37e req-e1a75111-d3ac-4547-95c8-f21fde22ff50 service nova] Releasing lock "refresh_cache-9c96e4d7-30d3-44fc-b8d0-14271dc19ce3" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1946.807760] env[59577]: DEBUG nova.compute.manager 
[req-534c4c89-5415-49fc-b45e-9615b36aa37e req-e1a75111-d3ac-4547-95c8-f21fde22ff50 service nova] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Received event network-vif-deleted-b39a43ff-1ec7-49c5-9e5b-17b008629544 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1947.033264] env[59577]: DEBUG nova.network.neutron [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Updated VIF entry in instance network info cache for port a67d68db-4bf8-431b-b399-abdf86da1d91. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1947.033605] env[59577]: DEBUG nova.network.neutron [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Updating instance_info_cache with network_info: [{"id": "a67d68db-4bf8-431b-b399-abdf86da1d91", "address": "fa:16:3e:fc:aa:fc", "network": {"id": "3d5c6d22-5eec-4f78-ae77-17cb5d9e2ef4", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1604332240-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d52da3118854611a3e2a8e5899b6ad9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f1b07b1-e4e5-4842-9090-07fb2c3e124b", "external-id": "nsx-vlan-transportzone-646", "segmentation_id": 646, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa67d68db-4b", "ovs_interfaceid": "a67d68db-4bf8-431b-b399-abdf86da1d91", "qbh_params": null, "qbg_params": null, "active": 
true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1947.034719] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1947.035359] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Processing image d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1947.035359] env[59577]: DEBUG oslo_concurrency.lockutils [None req-541274f0-64bc-403b-9eba-3bf9394cd307 tempest-ImagesTestJSON-798858018 tempest-ImagesTestJSON-798858018-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1947.042514] env[59577]: DEBUG oslo_concurrency.lockutils [req-b8bf4aba-4b87-4540-826f-3b31dbe2a1f0 req-e0d5a36f-dbfd-4f8f-bf97-8332afaf2f46 service nova] Releasing lock "refresh_cache-3d63cb9b-3c20-4c34-a96e-29e3dcea65a7" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1967.044683] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 
1970.044260] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1972.046392] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1975.040151] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1976.045407] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1976.045778] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1976.045778] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1976.055728] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1976.055937] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1976.056123] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1976.056278] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1976.057321] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68305e6c-a03a-4b5e-b77b-580abaa811a3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1976.065931] env[59577]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c502fbec-61a3-4390-bc1b-0e3f5e6c2a4a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1976.079450] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-321bcd76-341b-4a83-84b4-e4d3bda65bdf {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1976.085603] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9cbba84-50e3-4ca0-bb47-4c6425ea020d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1976.115221] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181257MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1976.115361] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1976.115546] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1976.167336] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 8ed18ae2-2ba1-424c-b695-846afd7b3501 actively managed on this 
compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1976.167499] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 5af13348-9f89-44b2-93bd-f9fb91598c73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1976.167629] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 56259797-6883-437c-8942-5beca0e1ef7b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1976.167755] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 22339279-c381-4ccb-bba0-b0b554203e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1976.167875] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1976.167992] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance e2c0fb3c-6cee-4be4-a368-0fa86a07ff88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1976.168920] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1976.168920] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1976.168920] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1976.255519] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e04e0c5-db46-4a04-8f8e-9dcfe40a3004 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1976.263153] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-bda2936a-cc46-4d81-9aeb-4abbbda70425 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1976.292189] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80589301-89c9-4d11-9f74-c94d5839ff1e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1976.299205] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3564e4f2-d288-4099-b525-8ab5989ae270 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1976.311571] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1976.319488] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1976.331983] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1976.332169] env[59577]: DEBUG 
oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1978.332016] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1979.045533] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1982.046184] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1982.046561] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1982.046561] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1982.063771] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Skipping network cache update for instance because it is Building. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1982.063912] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1982.064057] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1982.064187] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1982.064324] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1982.064420] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1982.064548] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Skipping network cache update for instance because it is Building. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1982.064667] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1989.970258] env[59577]: WARNING oslo_vmware.rw_handles [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1989.970258] env[59577]: ERROR oslo_vmware.rw_handles [ 1989.970961] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 
tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/df6f2ef7-1be7-43f7-a732-287bd40e34ad/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1989.972416] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1989.972667] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Copying Virtual Disk [datastore1] vmware_temp/df6f2ef7-1be7-43f7-a732-287bd40e34ad/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] vmware_temp/df6f2ef7-1be7-43f7-a732-287bd40e34ad/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1989.972967] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-09768a83-4e46-4cf6-8465-853aaf66cebb {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1989.981774] env[59577]: DEBUG oslo_vmware.api [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Waiting for the task: (returnval){ [ 1989.981774] env[59577]: value = "task-1933835" [ 1989.981774] env[59577]: _type = "Task" [ 1989.981774] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1989.989829] env[59577]: DEBUG oslo_vmware.api [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Task: {'id': task-1933835, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1990.491818] env[59577]: DEBUG oslo_vmware.exceptions [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Fault InvalidArgument not matched. {{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1990.493227] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1990.493227] env[59577]: ERROR nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1990.493227] env[59577]: Faults: ['InvalidArgument'] [ 1990.493227] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Traceback (most recent call last): [ 1990.493227] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1990.493227] env[59577]: ERROR nova.compute.manager [instance: 
b8002da2-eecd-490a-a34b-c651c28c57fc] yield resources [ 1990.493227] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1990.493227] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] self.driver.spawn(context, instance, image_meta, [ 1990.493227] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1990.493227] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] self._fetch_image_if_missing(context, vi) [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] image_cache(vi, tmp_image_ds_loc) [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] vm_util.copy_virtual_disk( [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: 
b8002da2-eecd-490a-a34b-c651c28c57fc] session._wait_for_task(vmdk_copy_task) [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] return self.wait_for_task(task_ref) [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] return evt.wait() [ 1990.493658] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1990.494073] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] result = hub.switch() [ 1990.494073] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1990.494073] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] return self.greenlet.switch() [ 1990.494073] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1990.494073] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] self.f(*self.args, **self.kw) [ 1990.494073] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1990.494073] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] raise 
exceptions.translate_fault(task_info.error) [ 1990.494073] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1990.494073] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Faults: ['InvalidArgument'] [ 1990.494073] env[59577]: ERROR nova.compute.manager [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] [ 1990.494073] env[59577]: INFO nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Terminating instance [ 1990.495096] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1990.495096] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1990.495299] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b604524c-25d9-4575-a838-ab83e0a21d7d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1990.497502] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 
b8002da2-eecd-490a-a34b-c651c28c57fc] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1990.497689] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1990.498416] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c978a90d-f21e-4c8e-aeee-ccc60a9c3bcc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1990.505095] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1990.505302] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2bb6b547-47b1-4bac-9635-7bf169953738 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1990.507385] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1990.507562] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1990.508486] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8c7aa40f-eed0-4e9c-9e2e-72400bd08078 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1990.513258] env[59577]: DEBUG oslo_vmware.api [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Waiting for the task: (returnval){ [ 1990.513258] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52651c45-3477-00b8-7756-829a203b0800" [ 1990.513258] env[59577]: _type = "Task" [ 1990.513258] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1990.520571] env[59577]: DEBUG oslo_vmware.api [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52651c45-3477-00b8-7756-829a203b0800, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1990.589296] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1990.589511] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1990.589731] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Deleting the datastore file [datastore1] b8002da2-eecd-490a-a34b-c651c28c57fc {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1990.589994] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1ca1d105-5558-4ba2-81b2-2ef728d0b143 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1990.595730] env[59577]: DEBUG oslo_vmware.api [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Waiting for the task: (returnval){ [ 1990.595730] env[59577]: value = "task-1933837" [ 1990.595730] env[59577]: _type = "Task" [ 1990.595730] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1990.603503] env[59577]: DEBUG oslo_vmware.api [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Task: {'id': task-1933837, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1991.023266] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1991.023526] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Creating directory with path [datastore1] vmware_temp/305ff74f-5edb-4184-94e0-cf4c6f2166ac/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1991.023793] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9ce2f367-adb2-4b7f-9a56-ce97f095503e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.035421] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Created directory with path [datastore1] vmware_temp/305ff74f-5edb-4184-94e0-cf4c6f2166ac/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1991.035610] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None 
req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Fetch image to [datastore1] vmware_temp/305ff74f-5edb-4184-94e0-cf4c6f2166ac/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1991.035780] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/305ff74f-5edb-4184-94e0-cf4c6f2166ac/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1991.036584] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a19c4559-4915-41aa-bd15-ddf360eda596 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.043516] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba40e23b-aac6-481f-96de-0298a8ad6bc2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.053970] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5290dda-06d0-4bd9-ad8f-eeff506ee47e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.057962] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1991.085086] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c061d6b-dad6-4f46-b21c-496746ad0e17 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.093527] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2f337500-b40f-4e96-b955-2d0f07da7253 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.104207] env[59577]: DEBUG oslo_vmware.api [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Task: {'id': task-1933837, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076871} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1991.104442] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1991.104643] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1991.104822] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Instance destroyed {{(pid=59577) destroy 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1991.104987] env[59577]: INFO nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1991.107014] env[59577]: DEBUG nova.compute.claims [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1991.107159] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1991.107400] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1991.114218] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1991.131537] env[59577]: DEBUG oslo_concurrency.lockutils [None 
req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1991.132217] env[59577]: DEBUG nova.compute.utils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Instance b8002da2-eecd-490a-a34b-c651c28c57fc could not be found. {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1991.134030] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1991.134202] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1991.134369] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1991.134536] env[59577]: DEBUG nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1991.134699] env[59577]: DEBUG nova.network.neutron [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1991.161476] env[59577]: DEBUG nova.network.neutron [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1991.170895] env[59577]: INFO nova.compute.manager [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: b8002da2-eecd-490a-a34b-c651c28c57fc] Took 0.04 seconds to deallocate network for instance. 
[ 1991.211861] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ea287114-154f-4764-80ca-7b92785586e6 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Lock "b8002da2-eecd-490a-a34b-c651c28c57fc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 353.679s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1991.240708] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1991.241525] env[59577]: ERROR nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. 
[ 1991.241525] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Traceback (most recent call last): [ 1991.241525] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1991.241525] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1991.241525] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1991.241525] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] result = getattr(controller, method)(*args, **kwargs) [ 1991.241525] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1991.241525] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self._get(image_id) [ 1991.241525] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1991.241525] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1991.241525] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] resp, body = self.http_client.get(url, headers=header) [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self.request(url, 'GET', **kwargs) [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self._handle_response(resp) [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise exc.from_response(resp, resp.content) [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] During handling of the above exception, another exception occurred: [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1991.242033] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Traceback (most recent call last): [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] yield resources [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self.driver.spawn(context, instance, image_meta, [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self._fetch_image_if_missing(context, vi) [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] image_fetch(context, vi, tmp_image_ds_loc) [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] images.fetch_image( [ 1991.242523] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] metadata = IMAGE_API.get(context, image_ref) [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return session.show(context, image_id, [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] _reraise_translated_image_exception(image_id) [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise new_exc.with_traceback(exc_trace) [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1991.242931] env[59577]: ERROR nova.compute.manager 
[instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] result = getattr(controller, method)(*args, **kwargs) [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1991.242931] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self._get(image_id) [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] resp, body = self.http_client.get(url, headers=header) [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self.request(url, 'GET', **kwargs) [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self._handle_response(resp) [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise exc.from_response(resp, resp.content) [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. [ 1991.243282] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1991.243558] env[59577]: INFO nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Terminating instance [ 1991.243558] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1991.243616] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1991.244205] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 
tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1991.244396] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1991.244625] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6cdaea21-7e86-4abf-ac1a-175a05205f91 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.247180] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd71d9ea-3a8e-4cf8-8455-db1cdbb82e8d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.255142] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1991.256172] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-80c1bbd4-4af0-4d90-8a78-cda4957697e1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.257589] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] 
Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1991.257772] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1991.258435] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fe0e5e0a-8d88-40c7-b62e-64093902b727 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.263934] env[59577]: DEBUG oslo_vmware.api [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Waiting for the task: (returnval){ [ 1991.263934] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52fd670e-d5d0-c1a3-a34f-ac45eec403aa" [ 1991.263934] env[59577]: _type = "Task" [ 1991.263934] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1991.271500] env[59577]: DEBUG oslo_vmware.api [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52fd670e-d5d0-c1a3-a34f-ac45eec403aa, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1991.342932] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1991.343173] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1991.343357] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Deleting the datastore file [datastore1] 077b8c8d-ee7e-495b-a7f7-676fe7c70f83 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1991.343619] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-49453325-aa43-46a7-ab9b-42b68b0421c2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.350586] env[59577]: DEBUG oslo_vmware.api [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Waiting for the task: (returnval){ [ 1991.350586] env[59577]: value = "task-1933839" [ 1991.350586] env[59577]: _type = "Task" [ 1991.350586] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1991.358497] env[59577]: DEBUG oslo_vmware.api [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Task: {'id': task-1933839, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1991.774483] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1991.774750] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Creating directory with path [datastore1] vmware_temp/5fa7b89f-f979-415e-a549-974a737092d7/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1991.775010] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-236a0541-6147-46a1-9e8b-ee7a343d8b5b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.785821] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Created directory with path [datastore1] vmware_temp/5fa7b89f-f979-415e-a549-974a737092d7/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1991.786020] env[59577]: DEBUG 
nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Fetch image to [datastore1] vmware_temp/5fa7b89f-f979-415e-a549-974a737092d7/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1991.786220] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/5fa7b89f-f979-415e-a549-974a737092d7/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1991.786942] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16d4aa19-e77f-46f0-b7ae-ae0be1124706 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.793671] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef35c889-99d8-4208-a486-b5df009d9399 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.803585] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cd1c5aa-a754-497e-9a14-68c8e7c9ca09 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.833700] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49349e82-e8df-4783-9203-fd12460ea4f3 {{(pid=59577) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.839292] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b7065ac4-53a0-430f-9077-657985c9ab1b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1991.858939] env[59577]: DEBUG oslo_vmware.api [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Task: {'id': task-1933839, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072854} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1991.860372] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1991.860515] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1991.860691] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1991.860867] env[59577]: INFO nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 
tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1991.862626] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1991.864687] env[59577]: DEBUG nova.compute.claims [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1991.864859] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1991.865096] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1991.896741] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 
tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.032s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1991.897466] env[59577]: DEBUG nova.compute.utils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Instance 077b8c8d-ee7e-495b-a7f7-676fe7c70f83 could not be found. {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1991.898877] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1991.899100] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1991.899270] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1991.899438] env[59577]: DEBUG nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1991.899597] env[59577]: DEBUG nova.network.neutron [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1991.913044] env[59577]: DEBUG oslo_vmware.rw_handles [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5fa7b89f-f979-415e-a549-974a737092d7/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1991.971890] env[59577]: DEBUG oslo_vmware.rw_handles [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Completed reading data from the image iterator. 
{{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1991.972153] env[59577]: DEBUG oslo_vmware.rw_handles [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5fa7b89f-f979-415e-a549-974a737092d7/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1992.045395] env[59577]: DEBUG neutronclient.v2_0.client [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59577) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1992.047277] env[59577]: ERROR nova.compute.manager [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1992.047277] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Traceback (most recent call last): [ 1992.047277] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1992.047277] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1992.047277] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1992.047277] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] result = getattr(controller, method)(*args, **kwargs) [ 1992.047277] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1992.047277] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self._get(image_id) [ 1992.047277] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1992.047277] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1992.047277] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] resp, body = self.http_client.get(url, headers=header) [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self.request(url, 'GET', **kwargs) [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self._handle_response(resp) [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise exc.from_response(resp, resp.content) [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] During handling of the above exception, another exception occurred: [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.047595] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Traceback (most recent call last): [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self.driver.spawn(context, instance, image_meta, [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self._fetch_image_if_missing(context, vi) [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] image_fetch(context, vi, tmp_image_ds_loc) [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] images.fetch_image( [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] metadata = IMAGE_API.get(context, image_ref) [ 1992.047921] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return session.show(context, image_id, [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] _reraise_translated_image_exception(image_id) [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise new_exc.with_traceback(exc_trace) [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 
1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] result = getattr(controller, method)(*args, **kwargs) [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self._get(image_id) [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1992.048266] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] resp, body = self.http_client.get(url, headers=header) [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self.request(url, 'GET', **kwargs) [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self._handle_response(resp) [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise exc.from_response(resp, resp.content) [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] nova.exception.ImageNotAuthorized: Not authorized for image d5e691af-5903-46f6-a589-e220c4e5798c. [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] During handling of the above exception, another exception occurred: [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.048556] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Traceback (most recent call last): [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self._build_and_run_instance(context, instance, image, [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] with excutils.save_and_reraise_exception(): [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self.force_reraise() [ 1992.048942] env[59577]: 
ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise self.value [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] with self.rt.instance_claim(context, instance, node, allocs, [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self.abort() [ 1992.048942] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return f(*args, **kwargs) [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self._unset_instance_host_and_node(instance) [ 1992.049362] env[59577]: ERROR nova.compute.manager 
[instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] instance.save() [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] updates, result = self.indirection_api.object_action( [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return cctxt.call(context, 'object_action', objinst=objinst, [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1992.049362] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] result = self.transport._send( [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self._driver.send(target, ctxt, message, [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self._send(target, ctxt, 
message, wait_for_reply, timeout, [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise result [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] nova.exception_Remote.InstanceNotFound_Remote: Instance 077b8c8d-ee7e-495b-a7f7-676fe7c70f83 could not be found. [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Traceback (most recent call last): [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return getattr(target, method)(*args, **kwargs) [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.049859] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return fn(self, *args, **kwargs) [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] old_ref, inst_ref = 
db.instance_update_and_get_original( [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return f(*args, **kwargs) [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] with excutils.save_and_reraise_exception() as ectxt: [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self.force_reraise() [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.050259] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise self.value [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 
142, in wrapper [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return f(*args, **kwargs) [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return f(context, *args, **kwargs) [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise exception.InstanceNotFound(instance_id=uuid) [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.050932] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] nova.exception.InstanceNotFound: Instance 077b8c8d-ee7e-495b-a7f7-676fe7c70f83 could not be found. 
[ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] During handling of the above exception, another exception occurred: [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Traceback (most recent call last): [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] ret = obj(*args, **kwargs) [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] exception_handler_v20(status_code, error_body) [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise client_exc(message=error_message, [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 
077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Neutron server returns request_ids: ['req-4e1b1af5-3678-4c6b-a466-e2f7143b3b5b'] [ 1992.051498] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] During handling of the above exception, another exception occurred: [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] Traceback (most recent call last): [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self._deallocate_network(context, instance, requested_networks) [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self.network_api.deallocate_for_instance( [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] data = neutron.list_ports(**search_opts) [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] ret = obj(*args, **kwargs) [ 1992.051891] env[59577]: ERROR 
nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1992.051891] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self.list('ports', self.ports_path, retrieve_all, [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] ret = obj(*args, **kwargs) [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] for r in self._pagination(collection, path, **params): [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] res = self.get(path, params=params) [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] ret = obj(*args, **kwargs) [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self.retry_request("GET", action, body=body, [ 
1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] ret = obj(*args, **kwargs) [ 1992.052257] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1992.052668] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] return self.do_request(method, action, body=body, [ 1992.052668] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1992.052668] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] ret = obj(*args, **kwargs) [ 1992.052668] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1992.052668] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] self._handle_fault_response(status_code, replybody, resp) [ 1992.052668] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1992.052668] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] raise exception.Unauthorized() [ 1992.052668] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] nova.exception.Unauthorized: Not authorized. 
[ 1992.052668] env[59577]: ERROR nova.compute.manager [instance: 077b8c8d-ee7e-495b-a7f7-676fe7c70f83] [ 1992.068273] env[59577]: DEBUG oslo_concurrency.lockutils [None req-4967f494-91a1-4024-8323-18344959bf88 tempest-AttachVolumeShelveTestJSON-1872904028 tempest-AttachVolumeShelveTestJSON-1872904028-project-member] Lock "077b8c8d-ee7e-495b-a7f7-676fe7c70f83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 353.756s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2027.046051] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2031.045118] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2032.044998] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2036.040328] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2037.045550] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2037.055287] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2037.055512] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2037.055675] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2037.055832] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 2037.056887] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa2c1b4e-af83-4ff3-b275-0727b6075407 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2037.065628] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09f1a5a1-72ea-4c6b-a663-86a238266e3f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2037.080672] env[59577]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68a78f26-be06-4527-82b9-2931d906f02a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2037.087121] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6b632ae-959f-4558-a402-9f784a0bb987 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2037.115689] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181320MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 2037.115846] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2037.116063] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2037.169783] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 8ed18ae2-2ba1-424c-b695-846afd7b3501 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 2037.169945] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 5af13348-9f89-44b2-93bd-f9fb91598c73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 2037.170095] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 56259797-6883-437c-8942-5beca0e1ef7b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 2037.170227] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 22339279-c381-4ccb-bba0-b0b554203e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 2037.170348] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 2037.170468] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance e2c0fb3c-6cee-4be4-a368-0fa86a07ff88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 2037.170589] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 2037.170792] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 2037.170934] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 2037.255238] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a59122a-c4de-4f60-8f0c-5911d97b9060 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2037.264059] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b7ef2a64-1f0a-472d-9188-7896e6203c5b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2037.293993] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7aa54a5-c72c-45b5-8ff6-ec6faa533ece {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2037.300906] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2ae10d6-b0ff-4a35-a56b-ac2dae8d71c8 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2037.313403] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2037.321400] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2037.336422] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 2037.336608] env[59577]: DEBUG 
oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2038.170543] env[59577]: WARNING oslo_vmware.rw_handles [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2038.170543] env[59577]: ERROR oslo_vmware.rw_handles [ 2038.170543] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 
tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/5fa7b89f-f979-415e-a549-974a737092d7/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2038.172601] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2038.172601] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Copying Virtual Disk [datastore1] vmware_temp/5fa7b89f-f979-415e-a549-974a737092d7/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] vmware_temp/5fa7b89f-f979-415e-a549-974a737092d7/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2038.172833] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a593212e-7348-4b2b-8cb6-0b6343347e64 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2038.180380] env[59577]: DEBUG oslo_vmware.api [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Waiting for the task: (returnval){ [ 2038.180380] env[59577]: value = "task-1933840" [ 2038.180380] env[59577]: _type = "Task" [ 2038.180380] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2038.188474] env[59577]: DEBUG oslo_vmware.api [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Task: {'id': task-1933840, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2038.336269] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2038.336525] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2038.336678] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 2038.692427] env[59577]: DEBUG oslo_vmware.exceptions [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Fault InvalidArgument not matched. 
{{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 2038.692427] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 2038.692606] env[59577]: ERROR nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2038.692606] env[59577]: Faults: ['InvalidArgument'] [ 2038.692606] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Traceback (most recent call last): [ 2038.692606] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 2038.692606] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] yield resources [ 2038.692606] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 2038.692606] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] self.driver.spawn(context, instance, image_meta, [ 2038.692606] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 2038.692606] env[59577]: ERROR nova.compute.manager 
[instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2038.692606] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2038.692606] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] self._fetch_image_if_missing(context, vi) [ 2038.692606] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] image_cache(vi, tmp_image_ds_loc) [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] vm_util.copy_virtual_disk( [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] session._wait_for_task(vmdk_copy_task) [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] return self.wait_for_task(task_ref) [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2038.692938] env[59577]: ERROR nova.compute.manager 
[instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] return evt.wait() [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] result = hub.switch() [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 2038.692938] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] return self.greenlet.switch() [ 2038.693282] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2038.693282] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] self.f(*self.args, **self.kw) [ 2038.693282] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2038.693282] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] raise exceptions.translate_fault(task_info.error) [ 2038.693282] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2038.693282] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Faults: ['InvalidArgument'] [ 2038.693282] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] [ 2038.693282] env[59577]: INFO nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 
tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Terminating instance [ 2038.694960] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 2038.694960] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2038.694960] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dab275fc-4c3a-4c71-a19f-031affd372ef {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2038.696975] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Start destroying the instance on the hypervisor. 
{{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 2038.697201] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2038.697986] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f3d27e8-91d6-46c1-a334-1eb64960658e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2038.704869] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2038.705096] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cfe991e3-c475-42f4-b68c-576345ac5ac6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2038.707280] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2038.707452] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2038.708417] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-de6c2063-c674-4259-9e93-c1a4514dae6f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2038.712996] env[59577]: DEBUG oslo_vmware.api [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Waiting for the task: (returnval){ [ 2038.712996] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52c9a74f-3525-3979-c82d-bf542fdcf950" [ 2038.712996] env[59577]: _type = "Task" [ 2038.712996] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2038.720006] env[59577]: DEBUG oslo_vmware.api [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52c9a74f-3525-3979-c82d-bf542fdcf950, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2039.224213] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2039.224558] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Creating directory with path [datastore1] vmware_temp/50661a85-9ff9-4634-aa2e-d8f3bb1a59ab/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2039.224680] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-303353d5-853e-4a4d-a672-e14319835778 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2039.244066] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Created directory with path [datastore1] vmware_temp/50661a85-9ff9-4634-aa2e-d8f3bb1a59ab/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2039.244278] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Fetch image to [datastore1] vmware_temp/50661a85-9ff9-4634-aa2e-d8f3bb1a59ab/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2039.244450] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/50661a85-9ff9-4634-aa2e-d8f3bb1a59ab/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2039.245190] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e4ec8f6-1e3e-4a20-b94a-e681db0e59a5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2039.251616] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e0dcd7d-747d-452a-ac9f-9cb7d4f6f633 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2039.260421] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c1bba72-0dc0-4288-95f5-c91dfc2c434e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2039.932821] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0486deaf-9295-4010-b1ab-81cae85fdb3c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2039.935352] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Unregistered the VM {{(pid=59577) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2039.935531] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2039.935702] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Deleting the datastore file [datastore1] 8ed18ae2-2ba1-424c-b695-846afd7b3501 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2039.935921] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f3a5724d-82c5-4bff-baa9-e679d1b8fce5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2039.941347] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c0339926-6e44-4a1f-b0a6-e17badc5f6ac {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2039.943821] env[59577]: DEBUG oslo_vmware.api [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Waiting for the task: (returnval){ [ 2039.943821] env[59577]: value = "task-1933842" [ 2039.943821] env[59577]: _type = "Task" [ 2039.943821] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2039.951364] env[59577]: DEBUG oslo_vmware.api [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Task: {'id': task-1933842, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2039.966548] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2040.018282] env[59577]: DEBUG oslo_vmware.rw_handles [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50661a85-9ff9-4634-aa2e-d8f3bb1a59ab/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 2040.071490] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2040.076156] env[59577]: DEBUG oslo_vmware.rw_handles [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Completed reading data from the image iterator. {{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 2040.076439] env[59577]: DEBUG oslo_vmware.rw_handles [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50661a85-9ff9-4634-aa2e-d8f3bb1a59ab/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 2040.454373] env[59577]: DEBUG oslo_vmware.api [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Task: {'id': task-1933842, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072411} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2040.454695] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2040.454695] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2040.454888] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2040.455014] env[59577]: INFO nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Took 1.76 seconds to destroy the instance on the hypervisor. 
[ 2040.457160] env[59577]: DEBUG nova.compute.claims [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2040.457329] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2040.457534] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2040.583035] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9a8455b-64a3-4d43-ab04-ee3726760d86 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2040.590371] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fde485e-020a-461d-81ff-73b96651569a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2040.618828] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8133aaef-5b05-4a96-9e7b-122dd2281959 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
2040.626017] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75f59500-11c3-4533-adc9-1d77aa306f20 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2040.639448] env[59577]: DEBUG nova.compute.provider_tree [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2040.647593] env[59577]: DEBUG nova.scheduler.client.report [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2040.660886] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.203s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2040.661013] env[59577]: ERROR nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d 
tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2040.661013] env[59577]: Faults: ['InvalidArgument'] [ 2040.661013] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Traceback (most recent call last): [ 2040.661013] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 2040.661013] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] self.driver.spawn(context, instance, image_meta, [ 2040.661013] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 2040.661013] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2040.661013] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2040.661013] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] self._fetch_image_if_missing(context, vi) [ 2040.661013] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2040.661013] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] image_cache(vi, tmp_image_ds_loc) [ 2040.661013] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in 
_cache_sparse_image [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] vm_util.copy_virtual_disk( [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] session._wait_for_task(vmdk_copy_task) [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] return self.wait_for_task(task_ref) [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] return evt.wait() [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] result = hub.switch() [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] return self.greenlet.switch() [ 2040.661480] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2040.661480] env[59577]: ERROR 
nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] self.f(*self.args, **self.kw) [ 2040.661785] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2040.661785] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] raise exceptions.translate_fault(task_info.error) [ 2040.661785] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2040.661785] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Faults: ['InvalidArgument'] [ 2040.661785] env[59577]: ERROR nova.compute.manager [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] [ 2040.661785] env[59577]: DEBUG nova.compute.utils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] VimFaultException {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2040.663396] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Build of instance 8ed18ae2-2ba1-424c-b695-846afd7b3501 was re-scheduled: A specified parameter was not correct: fileType [ 2040.663396] env[59577]: Faults: ['InvalidArgument'] {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 2040.663763] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 
8ed18ae2-2ba1-424c-b695-846afd7b3501] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 2040.663933] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 2040.664110] env[59577]: DEBUG nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2040.664268] env[59577]: DEBUG nova.network.neutron [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2041.106339] env[59577]: DEBUG nova.network.neutron [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2041.121155] env[59577]: INFO nova.compute.manager [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] 
Took 0.46 seconds to deallocate network for instance. [ 2041.211254] env[59577]: INFO nova.scheduler.client.report [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Deleted allocations for instance 8ed18ae2-2ba1-424c-b695-846afd7b3501 [ 2041.228176] env[59577]: DEBUG oslo_concurrency.lockutils [None req-99e0d8bf-fa7e-47be-b0dd-f677958a052d tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Lock "8ed18ae2-2ba1-424c-b695-846afd7b3501" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 390.182s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2041.228445] env[59577]: DEBUG oslo_concurrency.lockutils [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Lock "8ed18ae2-2ba1-424c-b695-846afd7b3501" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 193.604s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2041.228700] env[59577]: DEBUG oslo_concurrency.lockutils [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquiring lock "8ed18ae2-2ba1-424c-b695-846afd7b3501-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2041.228946] env[59577]: DEBUG oslo_concurrency.lockutils [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Lock 
"8ed18ae2-2ba1-424c-b695-846afd7b3501-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2041.229152] env[59577]: DEBUG oslo_concurrency.lockutils [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Lock "8ed18ae2-2ba1-424c-b695-846afd7b3501-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2041.231479] env[59577]: INFO nova.compute.manager [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Terminating instance [ 2041.233062] env[59577]: DEBUG oslo_concurrency.lockutils [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquiring lock "refresh_cache-8ed18ae2-2ba1-424c-b695-846afd7b3501" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 2041.233183] env[59577]: DEBUG oslo_concurrency.lockutils [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Acquired lock "refresh_cache-8ed18ae2-2ba1-424c-b695-846afd7b3501" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 2041.233605] env[59577]: DEBUG nova.network.neutron [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] 
[instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2041.261550] env[59577]: DEBUG nova.network.neutron [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Instance cache missing network info. {{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2041.404238] env[59577]: DEBUG nova.network.neutron [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2041.413365] env[59577]: DEBUG oslo_concurrency.lockutils [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Releasing lock "refresh_cache-8ed18ae2-2ba1-424c-b695-846afd7b3501" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 2041.413742] env[59577]: DEBUG nova.compute.manager [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Start destroying the instance on the hypervisor. 
{{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 2041.413933] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2041.414414] env[59577]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5351013c-ef74-4f20-9312-6adcc59707bc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2041.423932] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0510de4d-9690-4408-ab17-6526715a136c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2041.448399] env[59577]: WARNING nova.virt.vmwareapi.vmops [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8ed18ae2-2ba1-424c-b695-846afd7b3501 could not be found. 
[ 2041.448584] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2041.448754] env[59577]: INFO nova.compute.manager [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Took 0.03 seconds to destroy the instance on the hypervisor. [ 2041.448984] env[59577]: DEBUG oslo.service.loopingcall [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 2041.449202] env[59577]: DEBUG nova.compute.manager [-] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2041.449295] env[59577]: DEBUG nova.network.neutron [-] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2041.464893] env[59577]: DEBUG nova.network.neutron [-] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Instance cache missing network info. 
{{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2041.472683] env[59577]: DEBUG nova.network.neutron [-] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2041.480756] env[59577]: INFO nova.compute.manager [-] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] Took 0.03 seconds to deallocate network for instance. [ 2041.598165] env[59577]: DEBUG oslo_concurrency.lockutils [None req-62a5f427-2b47-4a27-a315-d5791913b31b tempest-AttachInterfacesUnderV243Test-1090741847 tempest-AttachInterfacesUnderV243Test-1090741847-project-member] Lock "8ed18ae2-2ba1-424c-b695-846afd7b3501" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.370s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2041.598994] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "8ed18ae2-2ba1-424c-b695-846afd7b3501" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 160.282s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2041.599206] env[59577]: INFO nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 8ed18ae2-2ba1-424c-b695-846afd7b3501] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2041.599437] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "8ed18ae2-2ba1-424c-b695-846afd7b3501" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2042.044622] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2042.044820] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 2042.044949] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 2042.062077] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 2042.062290] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Skipping network cache update for instance because it is Building. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 2042.062394] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 2042.062525] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 2042.063563] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: e2c0fb3c-6cee-4be4-a368-0fa86a07ff88] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 2042.063563] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Skipping network cache update for instance because it is Building. {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 2042.063563] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Didn't find any instances for network info cache update. 
{{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 2043.312660] env[59577]: DEBUG nova.compute.manager [req-231c7cc7-aeb1-4bd7-9374-e03c3668a6e7 req-b8030889-892d-43e7-8e0d-24729a089234 service nova] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Received event network-vif-deleted-ddd4e384-c685-4d4d-b892-53e96a0bb7b0 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 2043.544021] env[59577]: DEBUG nova.compute.manager [req-1e009935-4347-464d-8721-e003f54a8e2e req-c19444f5-3697-41d7-945d-1d8355dd685c service nova] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Received event network-vif-deleted-63f4eeb6-1ad0-4da2-a8ac-8ff233044606 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 2045.346306] env[59577]: DEBUG nova.compute.manager [req-0acf8904-0392-4a1d-a1e4-5b79894cc075 req-6efc91db-dc57-438a-8944-542231c93870 service nova] [instance: 22339279-c381-4ccb-bba0-b0b554203e60] Received event network-vif-deleted-c089d5d4-e798-41e1-bac8-058f3a1fe9de {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 2046.365952] env[59577]: DEBUG nova.compute.manager [req-fba39fae-454b-488d-a208-a6f6490fb288 req-ffab5753-64ab-42a0-9ece-b4cf12fb0ca6 service nova] [instance: 9c96e4d7-30d3-44fc-b8d0-14271dc19ce3] Received event network-vif-deleted-6b402131-56b2-48ab-bdc9-ab0e0d2a4709 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 2054.589734] env[59577]: DEBUG nova.compute.manager [req-09bcdcb0-7672-4a65-8c14-fd69acec0e88 req-2b2ee696-a226-4154-a303-a244740df2a1 service nova] [instance: 3d63cb9b-3c20-4c34-a96e-29e3dcea65a7] Received event network-vif-deleted-a67d68db-4bf8-431b-b399-abdf86da1d91 {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 2087.044377] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] 
Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2087.688892] env[59577]: WARNING oslo_vmware.rw_handles [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2087.688892] env[59577]: ERROR oslo_vmware.rw_handles [ 2087.689317] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Downloaded image file data 
d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/50661a85-9ff9-4634-aa2e-d8f3bb1a59ab/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2087.692173] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2087.692458] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Copying Virtual Disk [datastore1] vmware_temp/50661a85-9ff9-4634-aa2e-d8f3bb1a59ab/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] vmware_temp/50661a85-9ff9-4634-aa2e-d8f3bb1a59ab/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2087.692755] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-15163298-43a7-41ad-8c38-8787d647b1cc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2087.700712] env[59577]: DEBUG oslo_vmware.api [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Waiting for the task: (returnval){ [ 2087.700712] env[59577]: value = "task-1933863" [ 2087.700712] env[59577]: _type = "Task" [ 2087.700712] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2087.709007] env[59577]: DEBUG oslo_vmware.api [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Task: {'id': task-1933863, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2088.211500] env[59577]: DEBUG oslo_vmware.exceptions [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Fault InvalidArgument not matched. {{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 2088.211942] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 2088.212317] env[59577]: ERROR nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2088.212317] env[59577]: Faults: ['InvalidArgument'] [ 2088.212317] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Traceback (most recent call last): [ 2088.212317] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 
2088.212317] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] yield resources [ 2088.212317] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 2088.212317] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] self.driver.spawn(context, instance, image_meta, [ 2088.212317] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 2088.212317] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2088.212317] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2088.212317] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] self._fetch_image_if_missing(context, vi) [ 2088.212317] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] image_cache(vi, tmp_image_ds_loc) [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] vm_util.copy_virtual_disk( [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2088.212873] env[59577]: ERROR 
nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] session._wait_for_task(vmdk_copy_task) [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] return self.wait_for_task(task_ref) [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] return evt.wait() [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] result = hub.switch() [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 2088.212873] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] return self.greenlet.switch() [ 2088.213373] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2088.213373] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] self.f(*self.args, **self.kw) [ 2088.213373] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2088.213373] env[59577]: ERROR nova.compute.manager [instance: 
aac8eec6-577b-46d2-9baa-8cf548a6970e] raise exceptions.translate_fault(task_info.error) [ 2088.213373] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2088.213373] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Faults: ['InvalidArgument'] [ 2088.213373] env[59577]: ERROR nova.compute.manager [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] [ 2088.213373] env[59577]: INFO nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Terminating instance [ 2088.214257] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 2088.214469] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2088.214694] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5e34a08d-1fa5-4e86-9aec-19f691dc1cd4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.216844] env[59577]: DEBUG nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 
tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Start destroying the instance on the hypervisor. {{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 2088.217050] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2088.217773] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92e08ca2-f3a0-4135-94fc-d8c5f93be5a1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.224703] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2088.224908] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4424a1bb-13f7-47dc-a9e4-73cbeba4cb11 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.227544] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2088.227715] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 
tempest-AttachInterfacesTestJSON-87292261-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2088.228362] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-72abb934-3e41-469b-8f68-63cae50124a3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.232709] env[59577]: DEBUG oslo_vmware.api [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Waiting for the task: (returnval){ [ 2088.232709] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]527568cb-4f65-fc68-76ce-d96a9bb6d4af" [ 2088.232709] env[59577]: _type = "Task" [ 2088.232709] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2088.240279] env[59577]: DEBUG oslo_vmware.api [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]527568cb-4f65-fc68-76ce-d96a9bb6d4af, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2088.298971] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2088.299253] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2088.299331] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Deleting the datastore file [datastore1] aac8eec6-577b-46d2-9baa-8cf548a6970e {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2088.299578] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2dd348da-b5e7-43ee-919e-7437ec0fda82 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.305303] env[59577]: DEBUG oslo_vmware.api [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Waiting for the task: (returnval){ [ 2088.305303] env[59577]: value = "task-1933865" [ 2088.305303] env[59577]: _type = "Task" [ 2088.305303] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2088.312783] env[59577]: DEBUG oslo_vmware.api [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Task: {'id': task-1933865, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2088.743540] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2088.743801] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Creating directory with path [datastore1] vmware_temp/096fe7e2-8fb3-4bf1-aa5a-c22167d5472a/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2088.744032] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ec6fe8ef-4942-471a-922f-d0c88a118509 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.755136] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Created directory with path [datastore1] vmware_temp/096fe7e2-8fb3-4bf1-aa5a-c22167d5472a/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2088.755330] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None 
req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Fetch image to [datastore1] vmware_temp/096fe7e2-8fb3-4bf1-aa5a-c22167d5472a/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2088.755497] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/096fe7e2-8fb3-4bf1-aa5a-c22167d5472a/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2088.756204] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f243cd3-89bc-434d-96b9-378b3224fea2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.762567] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edfbc51d-e05b-474a-8da5-be76ee8e955a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.771423] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d027d7b-16e0-431e-ac20-90db0f71741e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.800528] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73b6dad3-6e98-423a-a143-21f73db14792 {{(pid=59577) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.808179] env[59577]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5f2233dc-9f08-4fb5-b2e0-f3b0debfaea3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2088.814166] env[59577]: DEBUG oslo_vmware.api [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Task: {'id': task-1933865, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067538} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2088.814390] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2088.814567] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2088.814735] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2088.814909] env[59577]: INFO nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 
tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Took 0.60 seconds to destroy the instance on the hypervisor. [ 2088.816995] env[59577]: DEBUG nova.compute.claims [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2088.817176] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2088.817389] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2088.828112] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2088.842959] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2088.843841] env[59577]: DEBUG nova.compute.utils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Instance aac8eec6-577b-46d2-9baa-8cf548a6970e could not be found. {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2088.845067] env[59577]: DEBUG nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 2088.845246] env[59577]: DEBUG nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 2088.845409] env[59577]: DEBUG nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 2088.845579] env[59577]: DEBUG nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2088.845741] env[59577]: DEBUG nova.network.neutron [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2088.871105] env[59577]: DEBUG nova.network.neutron [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2088.873596] env[59577]: DEBUG oslo_vmware.rw_handles [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/096fe7e2-8fb3-4bf1-aa5a-c22167d5472a/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 2088.927081] env[59577]: INFO nova.compute.manager [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] [instance: aac8eec6-577b-46d2-9baa-8cf548a6970e] Took 0.08 seconds to deallocate network for instance. [ 2088.931272] env[59577]: DEBUG oslo_vmware.rw_handles [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Completed reading data from the image iterator. {{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 2088.931433] env[59577]: DEBUG oslo_vmware.rw_handles [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/096fe7e2-8fb3-4bf1-aa5a-c22167d5472a/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 2088.972048] env[59577]: DEBUG oslo_concurrency.lockutils [None req-82c1a669-3621-4787-b057-7de02f56ac44 tempest-ServerDiskConfigTestJSON-1060147408 tempest-ServerDiskConfigTestJSON-1060147408-project-member] Lock "aac8eec6-577b-46d2-9baa-8cf548a6970e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 339.905s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2091.044308] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2092.045150] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2094.502948] env[59577]: DEBUG oslo_concurrency.lockutils [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Acquiring lock "f515af6b-b13a-4215-88df-681172342773" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2094.503271] env[59577]: DEBUG oslo_concurrency.lockutils [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "f515af6b-b13a-4215-88df-681172342773" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59577) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2094.511989] env[59577]: DEBUG nova.compute.manager [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Starting instance... {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 2094.554140] env[59577]: DEBUG oslo_concurrency.lockutils [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2094.554377] env[59577]: DEBUG oslo_concurrency.lockutils [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2094.555772] env[59577]: INFO nova.compute.claims [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2094.622527] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf43db2a-5daf-4824-b42e-7060898eb0b6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2094.630199] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c8f18b70-04f8-437d-861f-181b7e2da047 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2094.658774] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea27e650-ce11-4f67-8964-2a8b6e724101 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2094.665547] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27c5736c-18ad-48fc-9e2f-f6708c0eafe3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2094.679096] env[59577]: DEBUG nova.compute.provider_tree [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2094.689338] env[59577]: DEBUG nova.scheduler.client.report [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2094.701242] env[59577]: DEBUG oslo_concurrency.lockutils [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 
tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.147s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2094.701578] env[59577]: DEBUG nova.compute.manager [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Start building networks asynchronously for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 2094.733426] env[59577]: DEBUG nova.compute.utils [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Using /dev/sd instead of None {{(pid=59577) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2094.734527] env[59577]: DEBUG nova.compute.manager [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Allocating IP information in the background. 
{{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 2094.734697] env[59577]: DEBUG nova.network.neutron [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] allocate_for_instance() {{(pid=59577) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2094.744283] env[59577]: DEBUG nova.compute.manager [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Start building block device mappings for instance. {{(pid=59577) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 2094.774337] env[59577]: INFO nova.virt.block_device [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Booting with volume cae152a0-03b8-49e1-ba0e-a07439b21024 at /dev/sda [ 2094.790889] env[59577]: DEBUG nova.policy [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6ee8fea957fc4f34b247f39659a4f057', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1456ec94304f4a6e9389352afa682ace', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59577) authorize /opt/stack/nova/nova/policy.py:203}} [ 2094.815272] env[59577]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with 
opID=oslo.vmware-7ec61bf1-e7d9-4e3a-b8fc-9ea0fece2da2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2094.825455] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9f3858f-aed6-4c5a-b2bf-0d1ad401c15e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2094.850778] env[59577]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1294e6cc-7137-41ce-92b9-032a2fd33e57 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2094.857969] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a3ef12a-82da-480f-8d68-54eef959e68b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2094.884054] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c8ba977-5fd5-4a1b-bdc7-23f884834dc3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2094.890142] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3695dbe-1328-43ce-84dd-1afcb056130a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2094.903252] env[59577]: DEBUG nova.virt.block_device [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Updating existing volume attachment record: a4538233-1679-4e52-af53-ba1495d7e2a4 {{(pid=59577) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 2095.109191] env[59577]: DEBUG nova.compute.manager [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 
tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Start spawning the instance on the hypervisor. {{(pid=59577) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 2095.109736] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-12-17T13:42:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2095.109940] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Flavor limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2095.110109] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Image limits 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2095.110301] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] 
Flavor pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2095.110447] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Image pref 0:0:0 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2095.110591] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59577) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2095.110794] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2095.110966] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2095.111149] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Got 1 possible topologies {{(pid=59577) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2095.111314] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 
tempest-ServerActionsV293TestJSON-829439240-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2095.111527] env[59577]: DEBUG nova.virt.hardware [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59577) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2095.112578] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c15b9df2-d46a-41d8-aac1-8a68bc8610a4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2095.121703] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3717fb4-ad9d-418f-b703-c7520e1a6226 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2095.245074] env[59577]: DEBUG nova.network.neutron [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Successfully created port: 199687a9-cc02-408b-8e29-979f185dfcaa {{(pid=59577) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2095.691601] env[59577]: DEBUG nova.compute.manager [req-ac7ed207-ae83-4c32-bcd3-ec8374fe51f0 req-0f2bf257-3c53-47e6-b9cc-355b98de00ef service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Received event network-vif-plugged-199687a9-cc02-408b-8e29-979f185dfcaa {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 2095.691841] env[59577]: DEBUG oslo_concurrency.lockutils [req-ac7ed207-ae83-4c32-bcd3-ec8374fe51f0 
req-0f2bf257-3c53-47e6-b9cc-355b98de00ef service nova] Acquiring lock "f515af6b-b13a-4215-88df-681172342773-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2095.692430] env[59577]: DEBUG oslo_concurrency.lockutils [req-ac7ed207-ae83-4c32-bcd3-ec8374fe51f0 req-0f2bf257-3c53-47e6-b9cc-355b98de00ef service nova] Lock "f515af6b-b13a-4215-88df-681172342773-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2095.692613] env[59577]: DEBUG oslo_concurrency.lockutils [req-ac7ed207-ae83-4c32-bcd3-ec8374fe51f0 req-0f2bf257-3c53-47e6-b9cc-355b98de00ef service nova] Lock "f515af6b-b13a-4215-88df-681172342773-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2095.692782] env[59577]: DEBUG nova.compute.manager [req-ac7ed207-ae83-4c32-bcd3-ec8374fe51f0 req-0f2bf257-3c53-47e6-b9cc-355b98de00ef service nova] [instance: f515af6b-b13a-4215-88df-681172342773] No waiting events found dispatching network-vif-plugged-199687a9-cc02-408b-8e29-979f185dfcaa {{(pid=59577) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2095.692960] env[59577]: WARNING nova.compute.manager [req-ac7ed207-ae83-4c32-bcd3-ec8374fe51f0 req-0f2bf257-3c53-47e6-b9cc-355b98de00ef service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Received unexpected event network-vif-plugged-199687a9-cc02-408b-8e29-979f185dfcaa for instance with vm_state building and task_state spawning. 
[ 2095.763305] env[59577]: DEBUG nova.network.neutron [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Successfully updated port: 199687a9-cc02-408b-8e29-979f185dfcaa {{(pid=59577) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2095.773800] env[59577]: DEBUG oslo_concurrency.lockutils [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Acquiring lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 2095.773947] env[59577]: DEBUG oslo_concurrency.lockutils [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Acquired lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 2095.774122] env[59577]: DEBUG nova.network.neutron [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Building network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2095.807637] env[59577]: DEBUG nova.network.neutron [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Instance cache missing network info. 
{{(pid=59577) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2095.958591] env[59577]: DEBUG nova.network.neutron [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Updating instance_info_cache with network_info: [{"id": "199687a9-cc02-408b-8e29-979f185dfcaa", "address": "fa:16:3e:4e:fa:c9", "network": {"id": "7fea00f3-c1de-4045-94c0-1eac139011a1", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-944177904-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1456ec94304f4a6e9389352afa682ace", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap199687a9-cc", "ovs_interfaceid": "199687a9-cc02-408b-8e29-979f185dfcaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2095.969547] env[59577]: DEBUG oslo_concurrency.lockutils [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Releasing lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 2095.969868] env[59577]: DEBUG nova.compute.manager [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Instance network_info: |[{"id": "199687a9-cc02-408b-8e29-979f185dfcaa", "address": "fa:16:3e:4e:fa:c9", "network": {"id": "7fea00f3-c1de-4045-94c0-1eac139011a1", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-944177904-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1456ec94304f4a6e9389352afa682ace", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap199687a9-cc", "ovs_interfaceid": "199687a9-cc02-408b-8e29-979f185dfcaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59577) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 2095.970250] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4e:fa:c9', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': '69744f59-ecac-4b0b-831e-82a274d7acbb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '199687a9-cc02-408b-8e29-979f185dfcaa', 'vif_model': 'vmxnet3'}] {{(pid=59577) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2095.977958] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Creating folder: Project (1456ec94304f4a6e9389352afa682ace). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2095.978440] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-addb2c2e-a04d-4b24-a6f4-387cfbabd2e3 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2095.992749] env[59577]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 2095.992919] env[59577]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=59577) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 2095.993346] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Folder already exists: Project (1456ec94304f4a6e9389352afa682ace). Parent ref: group-v398749. {{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 2095.993535] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Creating folder: Instances. Parent ref: group-v398815. 
{{(pid=59577) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2095.993944] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5e13fb7f-92bd-49fa-9010-bb1093d752ce {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2096.003368] env[59577]: INFO nova.virt.vmwareapi.vm_util [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Created folder: Instances in parent group-v398815. [ 2096.003594] env[59577]: DEBUG oslo.service.loopingcall [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 2096.003768] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f515af6b-b13a-4215-88df-681172342773] Creating VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2096.003950] env[59577]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ccc09cff-aa61-4551-a214-e9f2d045e438 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2096.022659] env[59577]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2096.022659] env[59577]: value = "task-1933870" [ 2096.022659] env[59577]: _type = "Task" [ 2096.022659] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2096.029882] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933870, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2096.533691] env[59577]: DEBUG oslo_vmware.api [-] Task: {'id': task-1933870, 'name': CreateVM_Task, 'duration_secs': 0.445039} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2096.533865] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f515af6b-b13a-4215-88df-681172342773] Created VM on the ESX host {{(pid=59577) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2096.534501] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': None, 'disk_bus': None, 'boot_index': 0, 'mount_device': '/dev/sda', 'attachment_id': 'a4538233-1679-4e52-af53-ba1495d7e2a4', 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-398818', 'volume_id': 'cae152a0-03b8-49e1-ba0e-a07439b21024', 'name': 'volume-cae152a0-03b8-49e1-ba0e-a07439b21024', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'f515af6b-b13a-4215-88df-681172342773', 'attached_at': '', 'detached_at': '', 'volume_id': 'cae152a0-03b8-49e1-ba0e-a07439b21024', 'serial': 'cae152a0-03b8-49e1-ba0e-a07439b21024'}, 'delete_on_termination': True, 'guest_format': None, 'volume_type': None}], 'swap': None} {{(pid=59577) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 2096.534735] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: 
f515af6b-b13a-4215-88df-681172342773] Root volume attach. Driver type: vmdk {{(pid=59577) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 2096.535520] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-187bc7b0-5c6a-4536-b2f5-a1e73d0ef4d5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2096.542996] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cce601d-691d-4e34-ac00-ba6e1b939f38 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2096.548805] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebe89492-a21a-4590-89dc-d6172602874d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2096.554605] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-a2f37893-752f-430b-a2d2-1cd35b5826c2 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2096.561092] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for the task: (returnval){ [ 2096.561092] env[59577]: value = "task-1933871" [ 2096.561092] env[59577]: _type = "Task" [ 2096.561092] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2096.568125] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933871, 'name': RelocateVM_Task} progress is 5%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2097.040171] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2097.071150] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933871, 'name': RelocateVM_Task, 'duration_secs': 0.026787} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2097.071442] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Volume attach. 
Driver type: vmdk {{(pid=59577) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 2097.071676] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-398818', 'volume_id': 'cae152a0-03b8-49e1-ba0e-a07439b21024', 'name': 'volume-cae152a0-03b8-49e1-ba0e-a07439b21024', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'f515af6b-b13a-4215-88df-681172342773', 'attached_at': '', 'detached_at': '', 'volume_id': 'cae152a0-03b8-49e1-ba0e-a07439b21024', 'serial': 'cae152a0-03b8-49e1-ba0e-a07439b21024'} {{(pid=59577) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 2097.072437] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e006e86-9404-43e3-b82b-4813480bbfa6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2097.088429] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-945a4673-55b5-442c-a01e-08d3c29286c6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2097.109717] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Reconfiguring VM instance instance-00000021 to attach disk [datastore1] volume-cae152a0-03b8-49e1-ba0e-a07439b21024/volume-cae152a0-03b8-49e1-ba0e-a07439b21024.vmdk or device None with type thin {{(pid=59577) attach_disk_to_vm 
/opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 2097.109999] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-ae38ba7c-4eeb-4d8d-92f1-4f02bc8d1886 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2097.129028] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for the task: (returnval){ [ 2097.129028] env[59577]: value = "task-1933872" [ 2097.129028] env[59577]: _type = "Task" [ 2097.129028] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2097.136640] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933872, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2097.638480] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933872, 'name': ReconfigVM_Task, 'duration_secs': 0.270871} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2097.638743] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Reconfigured VM instance instance-00000021 to attach disk [datastore1] volume-cae152a0-03b8-49e1-ba0e-a07439b21024/volume-cae152a0-03b8-49e1-ba0e-a07439b21024.vmdk or device None with type thin {{(pid=59577) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 2097.643323] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-f3c62a59-b0fd-4c67-82c5-4555531482cf {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2097.657966] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for the task: (returnval){ [ 2097.657966] env[59577]: value = "task-1933873" [ 2097.657966] env[59577]: _type = "Task" [ 2097.657966] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2097.665979] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933873, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2097.719257] env[59577]: DEBUG nova.compute.manager [req-21721ca2-b62d-4da4-88bd-c009da600c84 req-60916792-3579-41d6-a5c8-cce885264a35 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Received event network-changed-199687a9-cc02-408b-8e29-979f185dfcaa {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 2097.719418] env[59577]: DEBUG nova.compute.manager [req-21721ca2-b62d-4da4-88bd-c009da600c84 req-60916792-3579-41d6-a5c8-cce885264a35 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Refreshing instance network info cache due to event network-changed-199687a9-cc02-408b-8e29-979f185dfcaa. {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 2097.719646] env[59577]: DEBUG oslo_concurrency.lockutils [req-21721ca2-b62d-4da4-88bd-c009da600c84 req-60916792-3579-41d6-a5c8-cce885264a35 service nova] Acquiring lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 2097.719800] env[59577]: DEBUG oslo_concurrency.lockutils [req-21721ca2-b62d-4da4-88bd-c009da600c84 req-60916792-3579-41d6-a5c8-cce885264a35 service nova] Acquired lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 2097.719954] env[59577]: DEBUG nova.network.neutron [req-21721ca2-b62d-4da4-88bd-c009da600c84 req-60916792-3579-41d6-a5c8-cce885264a35 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Refreshing network info cache for port 199687a9-cc02-408b-8e29-979f185dfcaa {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2097.953845] env[59577]: DEBUG nova.network.neutron [req-21721ca2-b62d-4da4-88bd-c009da600c84 req-60916792-3579-41d6-a5c8-cce885264a35 service nova] [instance: 
f515af6b-b13a-4215-88df-681172342773] Updated VIF entry in instance network info cache for port 199687a9-cc02-408b-8e29-979f185dfcaa. {{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2097.954209] env[59577]: DEBUG nova.network.neutron [req-21721ca2-b62d-4da4-88bd-c009da600c84 req-60916792-3579-41d6-a5c8-cce885264a35 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Updating instance_info_cache with network_info: [{"id": "199687a9-cc02-408b-8e29-979f185dfcaa", "address": "fa:16:3e:4e:fa:c9", "network": {"id": "7fea00f3-c1de-4045-94c0-1eac139011a1", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-944177904-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1456ec94304f4a6e9389352afa682ace", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap199687a9-cc", "ovs_interfaceid": "199687a9-cc02-408b-8e29-979f185dfcaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2097.963170] env[59577]: DEBUG oslo_concurrency.lockutils [req-21721ca2-b62d-4da4-88bd-c009da600c84 req-60916792-3579-41d6-a5c8-cce885264a35 service nova] Releasing lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 2098.043944] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2098.044328] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2098.044328] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59577) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 2098.044472] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2098.054154] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2098.054371] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2098.054541] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2098.054703] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59577) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 2098.055784] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc2c832c-4310-4ff0-84d5-c6b47d203287 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2098.066119] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa0ec231-100f-46c6-8af1-87e4ba5630d0 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2098.080134] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd93dd39-9f5f-4eb6-b8d9-edf5edde015b {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2098.086646] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8723417c-ac83-4e2e-9793-e612702044bc {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2098.115299] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181302MB free_disk=175GB free_vcpus=48 pci_devices=None {{(pid=59577) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 2098.115448] env[59577]: DEBUG 
oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2098.115637] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2098.150746] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Instance f515af6b-b13a-4215-88df-681172342773 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59577) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 2098.150956] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 2098.151127] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59577) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 2098.169340] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933873, 'name': ReconfigVM_Task, 'duration_secs': 0.428328} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2098.169623] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-398818', 'volume_id': 'cae152a0-03b8-49e1-ba0e-a07439b21024', 'name': 'volume-cae152a0-03b8-49e1-ba0e-a07439b21024', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'f515af6b-b13a-4215-88df-681172342773', 'attached_at': '', 'detached_at': '', 'volume_id': 'cae152a0-03b8-49e1-ba0e-a07439b21024', 'serial': 'cae152a0-03b8-49e1-ba0e-a07439b21024'} {{(pid=59577) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 2098.170215] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-1a98abe1-2edd-48a6-929c-34de7b008f53 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2098.177064] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for the task: (returnval){ [ 2098.177064] env[59577]: value = "task-1933874" [ 2098.177064] env[59577]: _type = "Task" [ 2098.177064] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2098.178535] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28de8793-d31e-48ae-97e1-872cb2e3d2be {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2098.190536] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-295421f9-6f61-439c-9edd-3bee58672dd7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2098.193477] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933874, 'name': Rename_Task} progress is 5%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2098.221514] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb79d54b-0c97-44ae-896d-4951dc5e4b90 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2098.228545] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a8245f2-854f-46b1-90f6-f710675b93e5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2098.241516] env[59577]: DEBUG nova.compute.provider_tree [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2098.249268] env[59577]: DEBUG nova.scheduler.client.report [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on 
inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2098.261603] env[59577]: DEBUG nova.compute.resource_tracker [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59577) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 2098.261765] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.146s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2098.687300] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933874, 'name': Rename_Task} progress is 14%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2099.188162] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933874, 'name': Rename_Task} progress is 14%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2099.689465] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933874, 'name': Rename_Task} progress is 99%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2100.190164] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933874, 'name': Rename_Task} progress is 99%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2100.691291] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933874, 'name': Rename_Task, 'duration_secs': 2.220241} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2100.691645] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Powering on the VM {{(pid=59577) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 2100.691971] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-2dbef324-59a7-4e83-9110-becb8d371258 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2100.698517] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for the task: (returnval){ [ 2100.698517] env[59577]: value = "task-1933875" [ 2100.698517] env[59577]: _type = "Task" [ 2100.698517] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2100.705871] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933875, 'name': PowerOnVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2101.208160] env[59577]: DEBUG oslo_vmware.api [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933875, 'name': PowerOnVM_Task, 'duration_secs': 0.425572} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2101.208519] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Powered on the VM {{(pid=59577) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 2101.208651] env[59577]: INFO nova.compute.manager [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Took 6.10 seconds to spawn the instance on the hypervisor. [ 2101.208862] env[59577]: DEBUG nova.compute.manager [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Checking state {{(pid=59577) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 2101.209633] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f69a0121-57ce-4da6-b2e3-5f15a6f96908 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2101.254773] env[59577]: INFO nova.compute.manager [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Took 6.71 seconds to build instance. 
[ 2101.262397] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2101.272576] env[59577]: DEBUG oslo_concurrency.lockutils [None req-082dca49-faba-4e07-bd4f-4df6c63bf0a7 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "f515af6b-b13a-4215-88df-681172342773" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.769s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2102.045044] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2102.045044] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Starting heal instance info cache {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 2102.045044] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Rebuilding the list of instances to heal {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 2102.093745] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquiring lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 2102.093896] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Acquired lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" 
{{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 2102.094066] env[59577]: DEBUG nova.network.neutron [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: f515af6b-b13a-4215-88df-681172342773] Forcefully refreshing network info cache for instance {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2004}} [ 2102.094254] env[59577]: DEBUG nova.objects.instance [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Lazy-loading 'info_cache' on Instance uuid f515af6b-b13a-4215-88df-681172342773 {{(pid=59577) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 2102.357029] env[59577]: DEBUG nova.network.neutron [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: f515af6b-b13a-4215-88df-681172342773] Updating instance_info_cache with network_info: [{"id": "199687a9-cc02-408b-8e29-979f185dfcaa", "address": "fa:16:3e:4e:fa:c9", "network": {"id": "7fea00f3-c1de-4045-94c0-1eac139011a1", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-944177904-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1456ec94304f4a6e9389352afa682ace", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap199687a9-cc", "ovs_interfaceid": "199687a9-cc02-408b-8e29-979f185dfcaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2102.365935] env[59577]: DEBUG oslo_concurrency.lockutils [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Releasing lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 2102.366143] env[59577]: DEBUG nova.compute.manager [None req-7dec82ad-57db-4310-a42d-539968646519 None None] [instance: f515af6b-b13a-4215-88df-681172342773] Updated the network info_cache for instance {{(pid=59577) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9885}} [ 2102.871917] env[59577]: DEBUG nova.compute.manager [req-f1a39e6f-4a64-486d-93d7-c4252f95f481 req-8fbf1f9b-9b3e-4162-b9b3-479dbdb29874 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Received event network-changed-199687a9-cc02-408b-8e29-979f185dfcaa {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 2102.872197] env[59577]: DEBUG nova.compute.manager [req-f1a39e6f-4a64-486d-93d7-c4252f95f481 req-8fbf1f9b-9b3e-4162-b9b3-479dbdb29874 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Refreshing instance network info cache due to event network-changed-199687a9-cc02-408b-8e29-979f185dfcaa. 
{{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 2102.872352] env[59577]: DEBUG oslo_concurrency.lockutils [req-f1a39e6f-4a64-486d-93d7-c4252f95f481 req-8fbf1f9b-9b3e-4162-b9b3-479dbdb29874 service nova] Acquiring lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 2102.872517] env[59577]: DEBUG oslo_concurrency.lockutils [req-f1a39e6f-4a64-486d-93d7-c4252f95f481 req-8fbf1f9b-9b3e-4162-b9b3-479dbdb29874 service nova] Acquired lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 2102.872642] env[59577]: DEBUG nova.network.neutron [req-f1a39e6f-4a64-486d-93d7-c4252f95f481 req-8fbf1f9b-9b3e-4162-b9b3-479dbdb29874 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Refreshing network info cache for port 199687a9-cc02-408b-8e29-979f185dfcaa {{(pid=59577) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2103.162272] env[59577]: DEBUG nova.network.neutron [req-f1a39e6f-4a64-486d-93d7-c4252f95f481 req-8fbf1f9b-9b3e-4162-b9b3-479dbdb29874 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Updated VIF entry in instance network info cache for port 199687a9-cc02-408b-8e29-979f185dfcaa. 
{{(pid=59577) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2103.162336] env[59577]: DEBUG nova.network.neutron [req-f1a39e6f-4a64-486d-93d7-c4252f95f481 req-8fbf1f9b-9b3e-4162-b9b3-479dbdb29874 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Updating instance_info_cache with network_info: [{"id": "199687a9-cc02-408b-8e29-979f185dfcaa", "address": "fa:16:3e:4e:fa:c9", "network": {"id": "7fea00f3-c1de-4045-94c0-1eac139011a1", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-944177904-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1456ec94304f4a6e9389352afa682ace", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69744f59-ecac-4b0b-831e-82a274d7acbb", "external-id": "nsx-vlan-transportzone-770", "segmentation_id": 770, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap199687a9-cc", "ovs_interfaceid": "199687a9-cc02-408b-8e29-979f185dfcaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2103.171734] env[59577]: DEBUG oslo_concurrency.lockutils [req-f1a39e6f-4a64-486d-93d7-c4252f95f481 req-8fbf1f9b-9b3e-4162-b9b3-479dbdb29874 service nova] Releasing lock "refresh_cache-f515af6b-b13a-4215-88df-681172342773" {{(pid=59577) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 2115.361651] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2121.541214] env[59577]: INFO nova.compute.manager [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Rebuilding instance [ 2121.573073] env[59577]: DEBUG nova.objects.instance [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lazy-loading 'trusted_certs' on Instance uuid f515af6b-b13a-4215-88df-681172342773 {{(pid=59577) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 2121.585403] env[59577]: DEBUG nova.compute.manager [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Checking state {{(pid=59577) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 2121.586301] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff3dd92d-832e-46c3-ad4f-56288a3716f9 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2121.623190] env[59577]: DEBUG nova.objects.instance [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lazy-loading 'pci_requests' on Instance uuid f515af6b-b13a-4215-88df-681172342773 {{(pid=59577) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 2121.632349] 
env[59577]: DEBUG nova.objects.instance [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lazy-loading 'pci_devices' on Instance uuid f515af6b-b13a-4215-88df-681172342773 {{(pid=59577) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 2121.639436] env[59577]: DEBUG nova.objects.instance [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lazy-loading 'resources' on Instance uuid f515af6b-b13a-4215-88df-681172342773 {{(pid=59577) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 2121.646465] env[59577]: DEBUG nova.objects.instance [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lazy-loading 'migration_context' on Instance uuid f515af6b-b13a-4215-88df-681172342773 {{(pid=59577) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 2121.652287] env[59577]: DEBUG nova.objects.instance [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Trying to apply a migration context that does not seem to be set for this instance {{(pid=59577) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1032}} [ 2121.652671] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Powering off the VM {{(pid=59577) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 2121.652916] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with 
opID=oslo.vmware-09ea0188-4d09-4449-a751-e29496d67511 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2121.660194] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for the task: (returnval){ [ 2121.660194] env[59577]: value = "task-1933876" [ 2121.660194] env[59577]: _type = "Task" [ 2121.660194] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2121.668409] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933876, 'name': PowerOffVM_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2122.170686] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933876, 'name': PowerOffVM_Task, 'duration_secs': 0.18131} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2122.170973] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Powered off the VM {{(pid=59577) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 2122.171635] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Powering off the VM {{(pid=59577) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 2122.171888] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-cae42c04-dc25-4e16-b379-d4cfdbb95a8f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2122.178213] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for the task: (returnval){ [ 2122.178213] env[59577]: value = "task-1933877" [ 2122.178213] env[59577]: _type = "Task" [ 2122.178213] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2122.185330] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933877, 'name': PowerOffVM_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2122.688123] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] VM already powered off {{(pid=59577) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 2122.688512] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Volume detach. Driver type: vmdk {{(pid=59577) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 2122.688512] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-398818', 'volume_id': 'cae152a0-03b8-49e1-ba0e-a07439b21024', 'name': 'volume-cae152a0-03b8-49e1-ba0e-a07439b21024', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'f515af6b-b13a-4215-88df-681172342773', 'attached_at': '', 'detached_at': '', 'volume_id': 'cae152a0-03b8-49e1-ba0e-a07439b21024', 'serial': 'cae152a0-03b8-49e1-ba0e-a07439b21024'} {{(pid=59577) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 2122.689274] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cf2e3d1-7949-483b-a4e3-688e3887b1e7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2122.706619] env[59577]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69abe119-51ca-4eea-a222-45d7835989a7 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2122.712964] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df209978-e954-413c-87e3-81a773d1f409 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2122.729682] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fd4ec22-04a9-4ca4-bc82-588b34c35808 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2122.743909] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] The volume has not been displaced from its original location: [datastore1] volume-cae152a0-03b8-49e1-ba0e-a07439b21024/volume-cae152a0-03b8-49e1-ba0e-a07439b21024.vmdk. No consolidation needed. 
{{(pid=59577) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 2122.749113] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Reconfiguring VM instance instance-00000021 to detach disk 2000 {{(pid=59577) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 2122.749403] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-ac4cd4ea-1b81-46e2-8ed1-458712c8b07c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2122.766724] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for the task: (returnval){ [ 2122.766724] env[59577]: value = "task-1933878" [ 2122.766724] env[59577]: _type = "Task" [ 2122.766724] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2122.775698] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933878, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2123.280014] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933878, 'name': ReconfigVM_Task, 'duration_secs': 0.17454} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2123.280395] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Reconfigured VM instance instance-00000021 to detach disk 2000 {{(pid=59577) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 2123.287634] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-8ed2d0e8-280c-4d00-b2c7-73e5a5cb45a4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2123.308263] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for the task: (returnval){ [ 2123.308263] env[59577]: value = "task-1933879" [ 2123.308263] env[59577]: _type = "Task" [ 2123.308263] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2123.318474] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933879, 'name': ReconfigVM_Task} progress is 6%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2123.818194] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933879, 'name': ReconfigVM_Task, 'duration_secs': 0.114846} completed successfully. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2123.818604] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-398818', 'volume_id': 'cae152a0-03b8-49e1-ba0e-a07439b21024', 'name': 'volume-cae152a0-03b8-49e1-ba0e-a07439b21024', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'f515af6b-b13a-4215-88df-681172342773', 'attached_at': '', 'detached_at': '', 'volume_id': 'cae152a0-03b8-49e1-ba0e-a07439b21024', 'serial': 'cae152a0-03b8-49e1-ba0e-a07439b21024'} {{(pid=59577) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 2123.818774] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2123.819434] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13719954-a827-4eb3-9372-44cf031b0817 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2123.825519] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2123.825719] env[59577]: DEBUG oslo_vmware.service [-] Invoking 
VirtualMachine.UnregisterVM with opID=oslo.vmware-323a1845-8116-4af2-948d-d303952194ff {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2123.900250] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2123.900526] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2123.900737] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Deleting the datastore file [datastore1] f515af6b-b13a-4215-88df-681172342773 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2123.901054] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f35c6498-8587-4b16-ad4d-cbbf8b2afd49 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2123.907587] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for the task: (returnval){ [ 2123.907587] env[59577]: value = "task-1933881" [ 2123.907587] env[59577]: _type = "Task" [ 2123.907587] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2123.914981] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933881, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2124.417603] env[59577]: DEBUG oslo_vmware.api [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Task: {'id': task-1933881, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080649} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2124.417939] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2124.418067] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2124.418173] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2124.466061] env[59577]: DEBUG nova.virt.vmwareapi.volumeops [None 
req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Volume detach. Driver type: vmdk {{(pid=59577) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 2124.466373] env[59577]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-74cc4a3b-668e-4604-a63b-b8a1374b3cb6 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2124.474904] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60c87229-4d68-4926-a6d1-6bc27d93e16f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2124.500351] env[59577]: ERROR nova.compute.manager [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Failed to detach volume cae152a0-03b8-49e1-ba0e-a07439b21024 from /dev/sda: nova.exception.InstanceNotFound: Instance f515af6b-b13a-4215-88df-681172342773 could not be found. 
[ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] Traceback (most recent call last): [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self.driver.rebuild(**kwargs) [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] raise NotImplementedError() [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] NotImplementedError [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] During handling of the above exception, another exception occurred: [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] Traceback (most recent call last): [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self.driver.detach_volume(context, old_connection_info, [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: 
f515af6b-b13a-4215-88df-681172342773] return self._volumeops.detach_volume(connection_info, instance) [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self._detach_volume_vmdk(connection_info, instance) [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] stable_ref.fetch_moref(session) [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] raise exception.InstanceNotFound(instance_id=self._uuid) [ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] nova.exception.InstanceNotFound: Instance f515af6b-b13a-4215-88df-681172342773 could not be found. 
[ 2124.500351] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] [ 2124.618289] env[59577]: DEBUG nova.compute.utils [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Build of instance f515af6b-b13a-4215-88df-681172342773 aborted: Failed to rebuild volume backed instance. {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2124.620677] env[59577]: ERROR nova.compute.manager [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance f515af6b-b13a-4215-88df-681172342773 aborted: Failed to rebuild volume backed instance. [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] Traceback (most recent call last): [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self.driver.rebuild(**kwargs) [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] raise NotImplementedError() [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] NotImplementedError [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] [ 2124.620677] env[59577]: ERROR 
nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] During handling of the above exception, another exception occurred: [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] Traceback (most recent call last): [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 3570, in _rebuild_volume_backed_instance [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self._detach_root_volume(context, instance, root_bdm) [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 3549, in _detach_root_volume [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] with excutils.save_and_reraise_exception(): [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self.force_reraise() [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] raise self.value [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] 
self.driver.detach_volume(context, old_connection_info, [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] return self._volumeops.detach_volume(connection_info, instance) [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self._detach_volume_vmdk(connection_info, instance) [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] stable_ref.fetch_moref(session) [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] raise exception.InstanceNotFound(instance_id=self._uuid) [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] nova.exception.InstanceNotFound: Instance f515af6b-b13a-4215-88df-681172342773 could not be found. 
[ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] [ 2124.620677] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] During handling of the above exception, another exception occurred: [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] Traceback (most recent call last): [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 10738, in _error_out_instance_on_exception [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] yield [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 3826, in rebuild_instance [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self._do_rebuild_instance_with_claim( [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 3912, in _do_rebuild_instance_with_claim [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self._do_rebuild_instance( [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 4104, in _do_rebuild_instance [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self._rebuild_default_impl(**kwargs) [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 3693, in _rebuild_default_impl [ 
2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] self._rebuild_volume_backed_instance( [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] File "/opt/stack/nova/nova/compute/manager.py", line 3585, in _rebuild_volume_backed_instance [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] raise exception.BuildAbortException( [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] nova.exception.BuildAbortException: Build of instance f515af6b-b13a-4215-88df-681172342773 aborted: Failed to rebuild volume backed instance. [ 2124.621991] env[59577]: ERROR nova.compute.manager [instance: f515af6b-b13a-4215-88df-681172342773] [ 2124.703194] env[59577]: DEBUG oslo_concurrency.lockutils [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2124.703395] env[59577]: DEBUG oslo_concurrency.lockutils [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2124.716792] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7895a05-689e-4fa9-9236-1c7aabc6363a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2124.724406] env[59577]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c822295-7735-4fb3-8aec-7b699f023963 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2124.754500] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0ece792-60e6-4394-8287-913324aad559 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2124.761353] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d64698f-751f-40e0-a6f8-1d75e4782938 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2124.773950] env[59577]: DEBUG nova.compute.provider_tree [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2124.782024] env[59577]: DEBUG nova.scheduler.client.report [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2124.794202] env[59577]: DEBUG oslo_concurrency.lockutils [None req-71d006b1-6424-473c-94d0-38b6915d1428 
tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2124.794389] env[59577]: INFO nova.compute.manager [None req-71d006b1-6424-473c-94d0-38b6915d1428 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Successfully reverted task state from rebuilding on failure for instance. [ 2125.153798] env[59577]: DEBUG oslo_concurrency.lockutils [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Acquiring lock "f515af6b-b13a-4215-88df-681172342773" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2125.154090] env[59577]: DEBUG oslo_concurrency.lockutils [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "f515af6b-b13a-4215-88df-681172342773" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2125.154260] env[59577]: DEBUG oslo_concurrency.lockutils [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Acquiring lock "f515af6b-b13a-4215-88df-681172342773-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 
2125.154482] env[59577]: DEBUG oslo_concurrency.lockutils [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "f515af6b-b13a-4215-88df-681172342773-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2125.154596] env[59577]: DEBUG oslo_concurrency.lockutils [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "f515af6b-b13a-4215-88df-681172342773-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2125.156826] env[59577]: INFO nova.compute.manager [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Terminating instance [ 2125.158558] env[59577]: DEBUG nova.compute.manager [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Start destroying the instance on the hypervisor. 
{{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 2125.159073] env[59577]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3ecdde73-fcb0-4e4d-95e0-cd275160e279 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2125.167846] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e7d923b-7111-4b31-b836-6a80ce0de90c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2125.192996] env[59577]: WARNING nova.virt.vmwareapi.driver [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance f515af6b-b13a-4215-88df-681172342773 could not be found. 
[ 2125.193213] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2125.193478] env[59577]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-349690b0-737e-47d7-b111-c64f61d6f408 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2125.201157] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e49d505-ee4b-48d1-bc1e-125c0097eaa4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2125.227961] env[59577]: WARNING nova.virt.vmwareapi.vmops [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f515af6b-b13a-4215-88df-681172342773 could not be found. [ 2125.228186] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2125.228380] env[59577]: INFO nova.compute.manager [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Took 0.07 seconds to destroy the instance on the hypervisor. 
[ 2125.228611] env[59577]: DEBUG oslo.service.loopingcall [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59577) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 2125.228862] env[59577]: DEBUG nova.compute.manager [-] [instance: f515af6b-b13a-4215-88df-681172342773] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2125.228939] env[59577]: DEBUG nova.network.neutron [-] [instance: f515af6b-b13a-4215-88df-681172342773] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2125.675484] env[59577]: DEBUG nova.network.neutron [-] [instance: f515af6b-b13a-4215-88df-681172342773] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2125.685543] env[59577]: INFO nova.compute.manager [-] [instance: f515af6b-b13a-4215-88df-681172342773] Took 0.46 seconds to deallocate network for instance. 
[ 2125.689847] env[59577]: DEBUG nova.compute.manager [req-d48953c1-284a-4e35-a4c9-bc471ffffb5a req-20825cab-b14e-499e-8791-a48bcfd73961 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Received event network-vif-deleted-199687a9-cc02-408b-8e29-979f185dfcaa {{(pid=59577) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 2125.689918] env[59577]: INFO nova.compute.manager [req-d48953c1-284a-4e35-a4c9-bc471ffffb5a req-20825cab-b14e-499e-8791-a48bcfd73961 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Neutron deleted interface 199687a9-cc02-408b-8e29-979f185dfcaa; detaching it from the instance and deleting it from the info cache [ 2125.690074] env[59577]: DEBUG nova.network.neutron [req-d48953c1-284a-4e35-a4c9-bc471ffffb5a req-20825cab-b14e-499e-8791-a48bcfd73961 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2125.697730] env[59577]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6400a15c-bb65-428d-927a-079a15d6ea21 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2125.706626] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33af89fa-2204-4955-a91a-d2005f24541f {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2125.732474] env[59577]: DEBUG nova.compute.manager [req-d48953c1-284a-4e35-a4c9-bc471ffffb5a req-20825cab-b14e-499e-8791-a48bcfd73961 service nova] [instance: f515af6b-b13a-4215-88df-681172342773] Detach interface failed, port_id=199687a9-cc02-408b-8e29-979f185dfcaa, reason: Instance f515af6b-b13a-4215-88df-681172342773 could not be found. 
{{(pid=59577) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10838}} [ 2125.742871] env[59577]: INFO nova.compute.manager [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Took 0.06 seconds to detach 1 volumes for instance. [ 2125.744843] env[59577]: DEBUG nova.compute.manager [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] [instance: f515af6b-b13a-4215-88df-681172342773] Deleting volume: cae152a0-03b8-49e1-ba0e-a07439b21024 {{(pid=59577) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3217}} [ 2125.807845] env[59577]: DEBUG oslo_concurrency.lockutils [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2125.808113] env[59577]: DEBUG oslo_concurrency.lockutils [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2125.808238] env[59577]: DEBUG nova.objects.instance [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lazy-loading 'resources' on Instance uuid f515af6b-b13a-4215-88df-681172342773 {{(pid=59577) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 2125.834270] 
env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd3260bf-7140-4917-89d4-66bb3d336b98 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2125.842567] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1a8f45b-dbb8-4f77-a65c-35e2e4c4f899 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2125.873368] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a85fba0a-f7dd-4a72-9b17-5193024fdab4 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2125.880410] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4131098e-a6df-4f35-9e9b-7ede1cd4b52e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2125.893488] env[59577]: DEBUG nova.compute.provider_tree [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Inventory has not changed in ProviderTree for provider: cbad7164-1dca-4b60-b95b-712603801988 {{(pid=59577) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2125.902446] env[59577]: DEBUG nova.scheduler.client.report [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Inventory has not changed for provider cbad7164-1dca-4b60-b95b-712603801988 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 175, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59577) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2125.916354] env[59577]: DEBUG oslo_concurrency.lockutils [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2125.987646] env[59577]: DEBUG oslo_concurrency.lockutils [None req-defad58e-8b91-40ff-a486-9e79c39904f9 tempest-ServerActionsV293TestJSON-829439240 tempest-ServerActionsV293TestJSON-829439240-project-member] Lock "f515af6b-b13a-4215-88df-681172342773" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.833s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2136.199860] env[59577]: WARNING oslo_vmware.rw_handles [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles response.begin() [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 318, in begin [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2136.199860] env[59577]: ERROR oslo_vmware.rw_handles [ 2136.200550] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Downloaded image file data d5e691af-5903-46f6-a589-e220c4e5798c to vmware_temp/096fe7e2-8fb3-4bf1-aa5a-c22167d5472a/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2136.202239] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Caching image {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2136.202496] env[59577]: DEBUG nova.virt.vmwareapi.vm_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Copying Virtual Disk [datastore1] vmware_temp/096fe7e2-8fb3-4bf1-aa5a-c22167d5472a/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk to [datastore1] vmware_temp/096fe7e2-8fb3-4bf1-aa5a-c22167d5472a/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk {{(pid=59577) 
copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2136.202791] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2b8c6bac-5483-48a8-b66a-0715981e0709 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2136.211363] env[59577]: DEBUG oslo_vmware.api [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Waiting for the task: (returnval){ [ 2136.211363] env[59577]: value = "task-1933883" [ 2136.211363] env[59577]: _type = "Task" [ 2136.211363] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2136.219266] env[59577]: DEBUG oslo_vmware.api [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Task: {'id': task-1933883, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2136.722265] env[59577]: DEBUG oslo_vmware.exceptions [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Fault InvalidArgument not matched. 
{{(pid=59577) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 2136.722514] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Releasing lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 2136.723075] env[59577]: ERROR nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2136.723075] env[59577]: Faults: ['InvalidArgument'] [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Traceback (most recent call last): [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] yield resources [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] self.driver.spawn(context, instance, image_meta, [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 
5af13348-9f89-44b2-93bd-f9fb91598c73] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] self._fetch_image_if_missing(context, vi) [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] image_cache(vi, tmp_image_ds_loc) [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] vm_util.copy_virtual_disk( [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] session._wait_for_task(vmdk_copy_task) [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] return self.wait_for_task(task_ref) [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 
5af13348-9f89-44b2-93bd-f9fb91598c73] return evt.wait() [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] result = hub.switch() [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] return self.greenlet.switch() [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] self.f(*self.args, **self.kw) [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] raise exceptions.translate_fault(task_info.error) [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Faults: ['InvalidArgument'] [ 2136.723075] env[59577]: ERROR nova.compute.manager [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] [ 2136.724353] env[59577]: INFO nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 
tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Terminating instance [ 2136.724898] env[59577]: DEBUG oslo_concurrency.lockutils [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Acquired lock "[datastore1] devstack-image-cache_base/d5e691af-5903-46f6-a589-e220c4e5798c/d5e691af-5903-46f6-a589-e220c4e5798c.vmdk" {{(pid=59577) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 2136.725122] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2136.725347] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7739e188-81c1-4d47-b24c-e6b4415d3e23 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2136.727410] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Start destroying the instance on the hypervisor. 
{{(pid=59577) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 2136.727599] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Destroying instance {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2136.728289] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14e30578-b08c-42d6-8b03-bad5882a99a5 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2136.734881] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Unregistering the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2136.735093] env[59577]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-63110776-9b5a-4722-a997-0bc6aec5d481 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2136.737045] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2136.737221] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59577) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2136.738117] env[59577]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-440f4e50-19d2-4927-8ef3-b47c6e941568 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2136.742675] env[59577]: DEBUG oslo_vmware.api [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Waiting for the task: (returnval){ [ 2136.742675] env[59577]: value = "session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52a39ee9-6f09-b3f5-5cc1-9274a8c961ff" [ 2136.742675] env[59577]: _type = "Task" [ 2136.742675] env[59577]: } to complete. {{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2136.750138] env[59577]: DEBUG oslo_vmware.api [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Task: {'id': session[5273dbe9-9a5e-ac6d-cc2a-4d8cae92386a]52a39ee9-6f09-b3f5-5cc1-9274a8c961ff, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2136.810478] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Unregistered the VM {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2136.810689] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Deleting contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2136.810906] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Deleting the datastore file [datastore1] 5af13348-9f89-44b2-93bd-f9fb91598c73 {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2136.811154] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4e171223-f8d9-4016-8223-507092180bf1 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2136.817136] env[59577]: DEBUG oslo_vmware.api [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Waiting for the task: (returnval){ [ 2136.817136] env[59577]: value = "task-1933885" [ 2136.817136] env[59577]: _type = "Task" [ 2136.817136] env[59577]: } to complete. 
{{(pid=59577) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 2136.824575] env[59577]: DEBUG oslo_vmware.api [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Task: {'id': task-1933885, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 2137.253553] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Preparing fetch location {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2137.253893] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Creating directory with path [datastore1] vmware_temp/92cc5d04-8850-415d-9ab5-17883695e031/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2137.254042] env[59577]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6aa8a4bd-770a-4def-983d-b4fdc7a5405a {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2137.266229] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Created directory with path [datastore1] vmware_temp/92cc5d04-8850-415d-9ab5-17883695e031/d5e691af-5903-46f6-a589-e220c4e5798c {{(pid=59577) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2137.266415] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 
tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Fetch image to [datastore1] vmware_temp/92cc5d04-8850-415d-9ab5-17883695e031/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk {{(pid=59577) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2137.266579] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to [datastore1] vmware_temp/92cc5d04-8850-415d-9ab5-17883695e031/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk on the data store datastore1 {{(pid=59577) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2137.267283] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7ff6ddf-fb60-4bf7-b2a5-8e3d0f57784c {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2137.273541] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e12483e7-f653-4a02-9a29-4e1756cb6259 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2137.282174] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c93de80-02af-43a3-a8aa-584e8860f177 {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2137.311744] env[59577]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1d195e0-7e6d-4ef7-a0b5-5977d8e3561e {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2137.316863] env[59577]: DEBUG oslo_vmware.service [-] Invoking 
SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8c284796-a562-4081-90ae-dd95e4f7943d {{(pid=59577) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 2137.325809] env[59577]: DEBUG oslo_vmware.api [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Task: {'id': task-1933885, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062526} completed successfully. {{(pid=59577) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 2137.326033] env[59577]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Deleted the datastore file {{(pid=59577) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2137.326222] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Deleted contents of the VM from datastore datastore1 {{(pid=59577) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2137.326385] env[59577]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Instance destroyed {{(pid=59577) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2137.326558] env[59577]: INFO nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2137.328767] env[59577]: DEBUG nova.compute.claims [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Aborting claim: {{(pid=59577) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2137.328959] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 2137.329205] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 2137.339461] env[59577]: DEBUG nova.virt.vmwareapi.images [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] [instance: 56259797-6883-437c-8942-5beca0e1ef7b] Downloading image file data d5e691af-5903-46f6-a589-e220c4e5798c to the data store datastore1 {{(pid=59577) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2137.354586] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=59577) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2137.355300] env[59577]: DEBUG nova.compute.utils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Instance 5af13348-9f89-44b2-93bd-f9fb91598c73 could not be found. {{(pid=59577) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2137.357216] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Instance disappeared during build. {{(pid=59577) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 2137.357392] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Unplugging VIFs for instance {{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 2137.357570] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59577) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 2137.357737] env[59577]: DEBUG nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Deallocating network for instance {{(pid=59577) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2137.357897] env[59577]: DEBUG nova.network.neutron [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] deallocate_for_instance() {{(pid=59577) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2137.387551] env[59577]: DEBUG oslo_vmware.rw_handles [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/92cc5d04-8850-415d-9ab5-17883695e031/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 2137.389243] env[59577]: DEBUG nova.network.neutron [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Updating instance_info_cache with network_info: [] {{(pid=59577) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2137.442070] env[59577]: INFO nova.compute.manager [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] [instance: 5af13348-9f89-44b2-93bd-f9fb91598c73] Took 0.08 seconds to deallocate network for instance. [ 2137.447170] env[59577]: DEBUG oslo_vmware.rw_handles [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Completed reading data from the image iterator. {{(pid=59577) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 2137.447170] env[59577]: DEBUG oslo_vmware.rw_handles [None req-ee1c8e07-7f65-4a0e-872b-b1e93a6b4dc8 tempest-ServersTestJSON-1267623694 tempest-ServersTestJSON-1267623694-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/92cc5d04-8850-415d-9ab5-17883695e031/d5e691af-5903-46f6-a589-e220c4e5798c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59577) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 2137.483148] env[59577]: DEBUG oslo_concurrency.lockutils [None req-3ce804c3-c574-4762-b54f-3791ec557de2 tempest-AttachInterfacesTestJSON-87292261 tempest-AttachInterfacesTestJSON-87292261-project-member] Lock "5af13348-9f89-44b2-93bd-f9fb91598c73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 291.868s {{(pid=59577) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 2147.046393] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 2151.045252] env[59577]: DEBUG oslo_service.periodic_task [None req-7dec82ad-57db-4310-a42d-539968646519 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59577) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}